Nathan M. Thornhill

Complexity Science, Information Theory & Computing

"To exist is to continually overcome loss"
4 Publications
4 US Patents
7 Journal Appearances

About

Nathan M. Thornhill is an independent researcher working at the intersection of complexity science, information theory, and computational physics. His most recent work, the Dynamic Existence Threshold, introduces a measurable framework for consciousness—demonstrating that the boundary between conscious and unconscious states can be detected with 91% accuracy across 136,394 EEG recordings, and that the same structural metric predicts critical transitions in financial markets and space weather 5–30 days in advance. A US provisional patent on consciousness classification and complex system monitoring has been filed.

The consciousness framework grew out of a deeper question: how do systems maintain their identity against entropy? Thornhill’s earlier work established the foundations—the Existence Threshold defined the conditions for pattern persistence, the 86% Scaling Law quantified how much information survives dimensional transitions, and the Dimensional Loss Theorem proved why. The Dynamic Existence Threshold unifies these into a single tool that works across substrates: brains, markets, stars, and potentially AI systems.

The implications for artificial intelligence are direct. The same integration-differentiation metric that distinguishes a conscious brain from deep sleep could be applied to neural networks and large language models—offering a substrate-independent test for whether an AI system possesses genuine organizational coherence or merely simulates it. This is not philosophical speculation; it is a falsifiable, quantitative framework with a patent covering AI and AGI consciousness classification.

All four papers have been accepted into the Centro de Ciencias de la Complejidad community at Universidad Autónoma de Baja California (UABC), a Mexican public research university with a dedicated complexity science center.

When not doing research, Thornhill runs 3Rivers WebTech, a technology consultancy in Fort Wayne, Indiana, and enjoys playing guitar, gardening, and spending time with his wife and daughter.

What is Complexity Science?

Complexity science studies how simple parts create surprisingly complex behavior through their interactions. A single bird follows basic rules about speed and spacing, but a flock of thousands produces mesmerizing, coordinated patterns that no individual bird is directing. The same principle appears everywhere: neurons firing in a brain give rise to consciousness, traders making individual decisions create stock market crashes, and water molecules interacting produce weather systems that span continents.

At its heart, complexity science is about emergence—the idea that the whole is more than the sum of its parts. A few key concepts tie the field together: self-organization, where order arises without a central controller; phase transitions, the tipping points where systems suddenly shift from one state to another; and feedback loops, where a system's outputs circle back to shape its future behavior. These aren't metaphors—they're measurable, mathematical patterns that repeat across biology, economics, physics, and computing.
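The tipping-point idea can be seen in miniature with a classic textbook model, the logistic map. This sketch is a standard illustration of a sudden qualitative shift in behavior, not code from Thornhill's papers; the function name and parameters are illustrative:

```python
# Toy illustration of a tipping point: the logistic map x -> r*x*(1-x).
# A small change in the feedback parameter r flips the long-run behavior
# from a single steady state to a persistent two-state oscillation.
def logistic_trajectory(r, x0=0.2, warmup=500, keep=8):
    """Iterate the map past transients, then return the settled values."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

# Below r = 3 the system settles to one fixed point; just above it,
# the same rule produces an oscillation between two values.
print(logistic_trajectory(2.8))  # one repeated value
print(logistic_trajectory(3.2))  # alternates between two values
```

The point of the example is that nothing about the rule itself changes at the transition; only a parameter crosses a threshold, and the system's qualitative behavior shifts.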

Why does it matter? Because the same mathematics that describes how ice melts into water can describe how a healthy brain transitions into a seizure, or how a stable economy tips into a recession. Complexity science provides a shared language for understanding these critical transitions—and Thornhill's research uses that language to build tools that can detect them before they happen.

What is Information Theory?

Information theory began in 1948 when Claude Shannon published "A Mathematical Theory of Communication," laying out the mathematics of how information can be measured, transmitted, and stored. His core insight was deceptively simple: information can be quantified in bits, and there are fundamental limits on how much information any channel can carry or any process can preserve. Shannon's framework gave engineers the tools to build everything from efficient phone networks to the compression algorithms in your smartphone.

A central concept in information theory is entropy—a measure of uncertainty or surprise in a message. High entropy means high unpredictability (and therefore high information content); low entropy means redundancy and predictability. This same mathematical framework now reaches far beyond telecommunications: biologists use it to analyze DNA sequences, physicists apply it to black hole thermodynamics, and neuroscientists use it to measure the complexity of brain activity.
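Shannon entropy can be computed in a few lines. This is a standard textbook calculation, not code from any of the papers discussed; the `shannon_entropy` helper is illustrative:

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0  -- fully predictable, no surprise
print(shannon_entropy("abababab"))  # 1.0  -- one bit per symbol
print(shannon_entropy("abcdefgh"))  # 3.0  -- eight equally likely symbols
```

The repetitive string carries no information per symbol, while the string of eight distinct symbols carries the maximum three bits each, matching the intuition that unpredictability and information content rise together.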

Thornhill's research applies information theory to understand how patterns persist across dimensional boundaries and how systems maintain organizational coherence. His work on the 86% scaling law measures exactly how much information survives when a system crosses from one dimension to another, while the Dynamic Existence Threshold uses information-theoretic metrics to detect when a system is losing its internal organization—whether that system is a human brain, a financial market, or the sun's magnetic field.

Publications

The Dynamic Existence Threshold: Integration-Differentiation Balance Predicts System State Across Substrates

A universal framework for detecting organizational dissolution across diverse systems. Demonstrates that a structural coupling metric (Integration-Differentiation balance) achieves 91% accuracy across 136,394 EEG recordings and predicts critical transitions 5–30 days in advance across financial markets, space weather, and neural data. Submitted to Chaos (AIP).

Journal Appearances

Selected for distribution through the Social Science Research Network (SSRN) eJournal system

The Dimensional Loss Theorem (April 17, 2026)
Selected for distribution in Generative AI

The Dimensional Loss Theorem (April 13, 2026)
Selected for distribution in Information Systems

The Dimensional Loss Theorem (March 24, 2026)
Selected for distribution in Computer Science Education, Vol. 9, No. 55

The 86% Scaling Law (March 23, 2026)
Selected for distribution in Computer Science Education, Vol. 9, No. 54

The Existence Threshold (March 13, 2026)
Selected for distribution in Information Theory & Research, Vol. 7, No. 29

The Existence Threshold (March 12, 2026)
Selected for distribution in Artificial Intelligence, Vol. 9, No. 47

The Existence Threshold (January 8, 2026)
Selected for distribution in Advanced Theoretical Physics & Mathematics — Kapodistrian Academy of Science (Greece)

US Provisional Patents

US Provisional Patent No. 64/029,658 — Filed April 4, 2026

Methods and Systems for Consciousness Classification and Complex System Monitoring

US Provisional Patent No. 63/964,528

Systems and Methods for Adversarial Geometric Encoding to Preserve Information Across Dimensional Boundaries

US Provisional Patent No. 63/967,821

Systems and Methods for Optimal Dimensional Encoding in Neural Networks

US Provisional Patent No. 63/969,588

Complete Three-Dimensional Geometric Encoding System for Data Preservation and Analysis

Contact

For research inquiries, collaboration opportunities, media requests, or general questions:

[email protected]