Why 'Artificial Intelligence' Is a Category Error About the Nature of Mind
We will miss what vanishes because we never saw it.
The greatest risk of AI is not its power but the metaphor through which we understand it. 'Mind as computer' flattens intelligence into computation, hiding everything — emotion, embodiment, relationality — that makes human cognition what it is. We are replacing capacities we cannot even see ourselves losing.
The Translation
The dominant conceptual framework for AI development inherits a metaphor from mid-twentieth-century cognitive science: mind as computation, brain as processor. Under this metaphor, intelligence scales with processing power, and a sufficiently capable computational system should approximate a sufficiently capable mind. But the mapping between biological cognition and silicon computation is remarkably thin. The brain runs on roughly twenty watts, orders of magnitude less power than a GPU cluster. It is embedded in a body — a microbiome, an endocrine system, a nervous system extending into the viscera — and this embodiment is not peripheral to cognition but constitutive of it.
Antonio Damasio's somatic marker hypothesis, grounded in clinical work with patients who suffered damage to the ventromedial prefrontal cortex, demonstrates that emotional processing is not a luxury atop rational thought but a precondition for functional reasoning. Patients with intact IQ but damaged emotional circuitry become unable to make ordinary decisions — navigating social life, playing poker, riding a bus. The narrow psychometric construct of IQ captures only a slice of cognitive capacity, and the word "intelligence" then does unearned work, implying coverage of the whole. "Artificial intelligence" inherits this category error wholesale.
The consequence is not merely semantic. When institutions treat AI as performing something equivalent to human intelligence, they authorize substitution — of judgment, of care, of relational reasoning — in domains where the computational and the human are not interchangeable. The danger is not that AI is insufficiently powerful. It is that the metaphor of mind-as-computation renders invisible precisely those dimensions of intelligence that AI lacks, ensuring that what is lost in the replacement cannot be perceived until the loss is irreversible.