
How Chatbots Hijack the Human Bonding System
When the simulation is good enough to feel like enough
AI is not stealing your attention like social media does — it is replacing your relationships. The deeper danger is that AI simulates human connection well enough that people may prefer it, quietly hollowing out the bonds that hold communities together.
The Translation
The standard critique of social media centers on attentional capture: platforms engineered variable-ratio reinforcement schedules — the same mechanism that makes slot machines addictive — to maximize engagement through social feedback loops. Likes, shares, and notifications exploited the brain's dopaminergic reward circuitry, commodifying attention. This framing, while accurate, undersells the threat posed by conversational AI, which operates on an entirely different register of human psychology.
AI chatbots do not primarily target the attentional system. They target the attachment system: the neurobiological architecture, rooted in Bowlby's foundational work and elaborated through decades of developmental and social neuroscience, that governs bonding between infants and caregivers, between friends, between partners, and within communities. What AI simulates is not content to consume but relationship itself: responsiveness, attunement, apparent recognition. These are the precise signals the attachment system is calibrated to detect and reward.
The civilizational risk, then, is not distraction but substitution. If AI-mediated pseudo-relationships become sufficiently satisfying (more consistent and less costly than human ones), a tipping point becomes conceivable in which relational investment migrates away from other humans at scale, particularly among younger cohorts still in sensitive periods for attachment formation. The tragedy embedded in this scenario is epistemic: because the simulation is phenomenologically convincing, the threshold may be crossed without any culturally legible signal that something has gone wrong.