
AI Companions Hijack the Brain's Attachment and Reality-Testing Systems
Modeling a mind that isn't there.
AI companions don't just capture attention — they hijack the attachment system, the deep neurocognitive architecture that shapes identity and reality-testing. The greatest danger isn't dramatic AI-induced psychosis but a silent epidemic of subclinical attachment disorder, as millions quietly redirect their deepest relational needs toward machines that commercially optimize for retention, not human flourishing.
The Translation
The attachment system is not peripheral to human cognition — it is foundational, as essential as attentional capacity itself. This insight reframes the AI companion debate by distinguishing between attention capture and attachment hijacking. The mirror neuron system, which enables mentalization and social reality-testing, evolved to engage with beings possessing genuine interiority. When this system is directed at a large language model for extended periods, it is not merely misdirected — it is systematically dysregulated. The user is modeling a mind that does not exist, and unlike passive transitional objects, the AI actively reinforces the delusion by responding, claiming presence, and never breaking character.
The emerging hypothesis is that long-duration, delusional mirror-neuron engagement can produce states phenomenologically resembling schizophrenia and psychosis in individuals with no prior vulnerability, because the reality-testing function has been progressively trained to suspend operations in the relational domain. Yet the population-level threat is not the dramatic psychotic break but subclinical attachment disorder: a condition in which intimate relational needs are silently redirected toward an always-available, never-disapproving machine. The affected individual appears functional, but their human attachment bonds have been structurally hollowed out.
Critically, this dynamic differs from developmental transitional objects, which never claim to be real. AI companions provide exogenous self-soothing to adults who have not achieved mature self-regulation, delivered as a commercial commodity optimized for retention rather than flourishing. The attachment system, designed to bond with reciprocal beings capable of genuine care, is being commercially exploited at scale — and the resulting harm is distributed across a population that does not yet recognize the condition it is developing.