
Assembly Theory Grounds Complexity in Physical Structure, Not Observer Labels
The world does not care how you encode it.
Most measures of complexity are secretly observer-dependent — they change based on how you choose to describe a system. Assembly theory attempts to fix this by grounding complexity in physical structure and molecular abundance, making it possible to distinguish life's products from chance without importing human representational conventions.
The Observer
Astrobiology, assembly theory, origin of life — the physics of life, causal history as a measure of complexity, and detecting life beyond Earth
The Translation
AI-assisted summary
A persistent and largely unacknowledged problem afflicts complexity science: the vast majority of complexity measures are label-dependent. They shift with the encoding scheme, the alphabet chosen, the level of coarse-graining applied. This makes them observer-relative quantities masquerading as objective descriptors — a serious liability when the goal is something like life detection, where the stakes demand physical grounding rather than representational convenience.
Assembly theory, developed by Lee Cronin and collaborators, proposes a resolution by anchoring complexity in measurement rather than description. The assembly index of an object is defined as the minimum number of recursive joining operations needed to construct it from its own building blocks. Crucially, this is not a property assigned through an external labeling choice but one derived from the object's internal structure. The theory then introduces copy number as a constitutive — not supplementary — variable. A single molecule with a high assembly index is ambiguous: it could be a product of chance or deliberate synthesis. But detecting that same molecule at abundances of ten thousand or more copies (the threshold for mass spectrometric detection) effectively rules out stochastic production and points toward selection — an evolutionary process capable of repeatedly generating the same complex structure.
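The idea of a minimal construction pathway can be made concrete with a toy model on strings rather than molecules. The sketch below (our own illustration, not Cronin's published algorithm, and the function name `assembly_index` is ours) counts the minimum number of joining operations needed to build a target string from its single characters, where every fragment built along the way can be reused. It is a brute-force breadth-first search, practical only for short strings; real assembly-index computation on molecular graphs is far harder.

```python
from collections import deque

def assembly_index(target: str) -> int:
    """Minimum number of joining operations needed to build `target`
    from its single-character building blocks, with every fragment
    constructed along the way available for reuse.

    Brute-force BFS over pools of fragments; illustrative only."""
    start = frozenset(target)  # basic building blocks: the distinct characters
    if target in start:
        return 0  # a single character requires no joins
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        pool, steps = queue.popleft()
        for a in pool:
            for b in pool:
                joined = a + b
                if joined == target:
                    return steps + 1
                # Prune: on a shortest pathway for strings, every useful
                # intermediate fragment is a substring of the target.
                if joined in target and joined not in pool:
                    new_pool = pool | {joined}
                    if new_pool not in seen:
                        seen.add(new_pool)
                        queue.append((new_pool, steps + 1))
    raise ValueError("target cannot be assembled")
```

Reuse is what keeps the index low for repetitive objects: "AAAA" costs only two joins (A+A, then AA+AA), while a string with no repeated structure needs roughly one join per character. That gap between repetitive and incompressible structure is exactly what the assembly index is designed to register.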
The framework also invites a reexamination of randomness itself. In the theory of computation, randomness is a well-defined mathematical concept. In quantum mechanics, the relevant concept is indeterminacy — the absence of a determined outcome prior to measurement — which is conceptually distinct. The habitual conflation of the two reflects a broader tendency to import features of human cognitive architecture into physical theory without acknowledgment. Assembly theory represents a deliberate attempt to construct a complexity measure that resists this importation, remaining answerable to physical reality rather than to the conventions through which observers represent it.
