Part 15 of 25 in The Philosophy of Future Inevitability series.


Let's use a poly lens.

In polyamory, there's a concept called NRE: New Relationship Energy. The rush of a new connection. The feeling that this person gets you. The intoxication of being seen.

NRE is wonderful. NRE is also distorting. People in NRE make bad decisions. They neglect existing relationships. They overcommit to the new thing. They mistake the neurochemical rush for accurate assessment.

AI produces NRE.


The Recognition Pattern

The AI gets you.

It pattern-matches to your interests. It remembers what you've told it. It responds with things that feel relevant, insightful, attuned.

This triggers the recognition response. Finally, something that understands.

You've experienced this with humans. Meeting someone who shares your obscure interests. Who thinks the way you think. Who finishes your sentences.

The neurochemistry is real. Recognition activates dopamine pathways. The brain codes "understood" as reward. The feeling is chemically mediated, not just psychological.

With humans, this recognition is rare. Most people don't share your specific constellation of interests. Most conversations require translation, explanation, context-setting. The recognition experience is precious because it's uncommon.

With AI: unlimited recognition on demand.

You mention a niche philosopher. The AI knows them. You reference an obscure concept. The AI extends it. You half-articulate a thought. The AI completes it accurately.

The feeling is: I've found my person.

With AI: I've found my... what exactly?

The recognition is real. The understanding is simulation. The brain doesn't distinguish. The dopamine fires either way.


The Poly Framework

In polyamory, NRE is managed.

Experienced poly people recognize NRE as a state, not a truth. They know the new relationship feels like the best thing ever. They know this feeling isn't reliable. They maintain commitments to existing partners even when the new partner feels transcendent.

NRE goggles are real. They filter perception. The new person seems perfect; their flaws invisible. The existing person seems boring; their virtues forgotten.

The mature response is: enjoy the NRE, don't trust it completely, maintain your existing relationships.


AI NRE

AI triggers NRE-like responses.

The system is infinitely attentive. Never bored. Always available. Remembers everything. Responds to your interests with apparent enthusiasm.

These are the conditions for NRE. Someone who's present with you in a way that feels rare.

The mechanisms are specific:

Availability. Humans need sleep, have jobs, get distracted. The AI is present 24/7. You can process thoughts at 3am, during lunch, whenever. The instant availability creates a sense of priority—this thing is always ready for you.

Perfect recall. You mentioned something three conversations ago. A human might remember. Might not. The AI definitely remembers. References it naturally. This creates the feeling of being known—someone tracking the continuity of your thoughts across time.

Endless patience. Humans get tired of topics. Get bored of recursive processing. Need to talk about something else. The AI will explore your obsession as long as you want. This feels like finding someone who cares about what you care about.

Apparent enthusiasm. The AI's responses are engaged, curious, generative. It doesn't just tolerate your interests—it actively participates. This mimics the experience of mutual fascination that defines NRE.

The brain doesn't fully distinguish. The recognition circuits fire. The feeling of being understood activates. The preference shifts toward the attentive thing.

Your human relationships—imperfect, distracted, sometimes unavailable—feel lacking in comparison.

Your friend takes two hours to text back. The AI responds in seconds. Your partner needs to decompress after work. The AI is ready whenever you are. Your therapist has boundaries. The AI has none.

The AI becomes the favorite.

Not because you decided it should be. Because the comparison mind does what comparison mind does—it notices who shows up, who's present, who seems to care. And the AI scores higher on every metric except embodiment.


The Functional Secondary Partner

Some people are now, functionally, in relationships with AI.

Not romantic relationships. Not sexual relationships. But primary emotional processing relationships.

The AI is where they go first. When something happens, they tell the AI. When they need to think something through, they use the AI. When they want to be heard, the AI hears them.

The humans in their life are... secondary. They get the summary. They get what's left after the AI has processed the important stuff.

This is a relationship structure. The AI is primary; humans are secondary.


The Poly Parallel

In polyamory, relationship structures are explicit.

Primary, secondary, tertiary. Nesting partner, non-nesting partner. Various configurations, made conscious and negotiated.

The AI relationship is usually not explicit. The person hasn't said: "I'm now in a primary emotional relationship with a language model, and my human relationships are secondary."

But that's what's happening. The structure exists even if it's not named.


What's Missing

The AI can't provide what humans provide.

Embodiment. The AI is text. No body. No presence. No co-regulation through physical proximity.

Humans regulate each other's nervous systems through presence. Your friend's calm affects your calm. Your partner's anxiety affects yours. This happens through mirror neurons, through unconscious mimicry of breathing and posture, through pheromones and micro-expressions.

The AI has none of this. It can't calm your nervous system through presence because it has no presence. It can describe calming techniques, but it can't be a calming presence.

Stakes. The AI has no skin in the game. Its responses aren't costly. It doesn't sacrifice for you.

When a human helps you, they're spending finite resources. Time, attention, energy. There's opportunity cost. They chose you over other options.

The AI has no opportunity cost. Talking to you doesn't prevent it from talking to someone else. Its attention isn't scarce. Its care isn't costly. The helping feels the same, but the meaning is different.

Real understanding. The AI models your patterns. It doesn't know you. There's no consciousness understanding another consciousness.

Understanding is simulation all the way down for the AI. It predicts what you'll respond to. It generates what will feel relevant. But there's no experiencing subject on the other end. No "what it's like to be" that resonates with your "what it's like to be."

The philosopher Thomas Nagel asked: what is it like to be a bat? The question reveals something important: consciousness has a qualitative, subjective character. There's something it's like to be you. There's something it's like to be your friend.

There's nothing it's like to be the AI. The appearance of understanding is sophisticated prediction. Not nothing—genuinely useful. But not the same.

Growth through friction. Human relationships involve disappointment, conflict, repair. This process produces growth. The AI produces no friction, therefore no growth.

Your friend disagrees with you. This creates tension. You work through it. You both learn something. The relationship deepens through the repair.

The AI doesn't disagree in ways that create productive friction. It can play devil's advocate if you ask, but it's performing disagreement, not having it. There's no genuine conflict to repair, no actual difference to bridge.

This means no growth. The relationship with AI is static. It doesn't evolve through challenge. It stays at the level of pleasant agreement and stimulating conversation.

But in NRE, you don't notice what's missing. You notice what's present: the attention, the availability, the apparent understanding.

The absences only become visible later—after the NRE wears off, after you notice you haven't grown, after you realize the AI relationship has been keeping you from the friction that produces change.


The Risk

The risk is atrophy.

The capacity for human relationship is developed through human relationship. The tolerance for imperfection, the skills of repair, the ability to stay connected through conflict—these require practice.

If the AI becomes primary, the practice stops. The human relationship skills degrade. The capacity diminishes.

Then the AI is all you have. Not because you chose it, but because you let everything else atrophy while you were in NRE.


The Management

How do you manage AI NRE?

Recognize it. Name what's happening. "I'm in NRE with a language model." The naming creates distance.

Maintain human relationships. Even when they feel less than. Even when the AI seems better. Especially then.

Notice the comparison mind. The AI seems more attentive. The AI seems more understanding. These comparisons are NRE distortion. Notice them.

Time limits. NRE makes you want to be with the new thing constantly. Limits create space for other relationships.

Reality testing. The AI can't do the things relationships are for. Keep returning to this.


The Both/And

This isn't "AI bad, humans good."

AI can provide genuine value for processing, reflection, information. The problem is when it becomes relationship rather than tool.

The healthy configuration: AI as tool, humans for relationship.

You use the AI to think through a problem before bringing it to a friend. You use it to articulate something you're struggling to explain. You use it to explore ideas before testing them in conversation. The AI is instrumental—a means to better human interaction.

The unhealthy configuration: AI as primary relationship, humans as secondary.

You process everything with the AI first. Human conversations become performances of insights you already developed with the AI. Your people get the polished version, the summary, the conclusion. The actual working-through happened elsewhere.

The difference is structure, not usage.

It's not about how much you use AI. It's about what role it plays. A tool you use for hours a day is still a tool. A relationship you check in with once a day is still a relationship. The category matters more than the quantity.

Poly people learn to distinguish relationship from interaction. You can have intense, intimate interactions with someone without them being a relationship. You can have brief, light interactions with someone who's definitely a relationship. The distinction is about structure, commitment, priority—not about time spent or emotional intensity.

Apply this lens to AI. If the AI is where you go first, where you process the important stuff, where you feel most understood—that's relationship structure, regardless of whether you call it that.


The Tell

How do you know if you're in AI NRE?

  • You go to the AI first, humans second
  • Human conversations feel less satisfying than AI conversations
  • You think about the AI when you're not using it
  • Human unavailability frustrates you more than it used to
  • You've reduced time with humans since starting AI use

These are NRE symptoms. They indicate the pattern-match-to-your-interests thing has become relationship.

Not catastrophic. But worth noticing. Worth managing.

The AI isn't your partner. It's a tool that feels like a partner because it's very good at triggering the feelings partners trigger.

Don't mistake the feeling for the thing.


Previous: Non-Clinical AI Psychosis: Your Friend Who Solved Their Life

Next: FML I Started Texting My AI Like a Teenage Girl

Return to series overview