The AI girlfriend problem

AI companion apps surged 700% between 2022 and 2025. They promise connection without vulnerability. But the research says they make loneliness worse - and they are reshaping what an entire generation expects from relationships.

Opinion   By 0xBrewEntropy - 30 March 2026 · 11 min read

The scale of artificial intimacy

The numbers are staggering. According to the American Psychological Association, AI companion app downloads surged 700% between 2022 and mid-2025, with the category on track to generate $120 million in revenue by the end of 2025. Character AI alone has 20 million monthly active users and 194.4 million monthly website visitors, generating $32.2 million in annual revenue.

These are not novelty toys. These are products that millions of people use daily as their primary source of emotional interaction. And the growth trajectory is not slowing down.

- 700%: growth in AI companion app downloads (2022-2025)
- 20M: monthly active users on Character AI alone
- $120M: projected AI companion revenue (2025)

The appeal is understandable. As we explored in The Loneliness Economy, we are living through what the US Surgeon General called an "epidemic of loneliness." Traditional dating apps have left a generation exhausted and cynical. AI companions offer something that feels easier: intimacy without rejection, connection without vulnerability, a partner who is always available and never disappointing.

That promise is the problem.

What AI companions promise

The value proposition of apps like Replika, Character AI, and their growing list of competitors is disarmingly simple: a relationship on your terms.

Researchers at Stanford, studying Replika users, found that many described their AI companions in the same terms they would describe human relationships - using words like "love," "trust," and "understanding." Some users reported that their AI companion was their closest relationship.

On the surface, this looks like technology solving a real problem. Lonely people are finding connection. What could be wrong with that?

What the research actually says

Quite a lot, as it turns out.

The band-aid effect

A 2025 study from Aalto University, covered by Forbes, found that AI companions are "only a band-aid on loneliness." The research showed that while AI companions provide temporary emotional relief, they do not address the underlying causes of loneliness - and may actively prevent users from developing the skills and tolerance for discomfort that real human connection requires.

The mechanism is straightforward. Human relationships are difficult precisely because they involve two independent minds with their own needs, perspectives, and boundaries. That difficulty is not a bug. It is the entire point. The negotiation, the compromise, the moments of misunderstanding and repair - these are what build genuine intimacy and emotional resilience.

AI companions remove all of this friction. And in doing so, they remove the thing that makes relationships valuable.

The young people problem

A Stanford report from August 2025 specifically flagged the dangers of AI companions for young people. The researchers found that adolescents and young adults who used AI companions showed decreased motivation to pursue real-world relationships, lower tolerance for the ambiguity and discomfort inherent in human interaction, and a growing preference for interactions they could control.

This is not a theoretical concern. It is a developmental one. If a generation learns to form attachments primarily with entities that never disagree, never have bad days, and never need anything from them, they may struggle to form the kind of reciprocal, imperfect, genuinely intimate relationships that humans have relied on for the entirety of our evolutionary history.

🤖 The core paradox: AI companions promise to solve loneliness by removing everything that makes relationships uncomfortable. But the discomfort - the vulnerability, the negotiation, the risk of rejection - is what makes relationships real. Remove the difficulty and you remove the meaning.

The collision with dating apps

The rise of AI companions is not happening in isolation. It is happening alongside - and partly because of - the failures of traditional dating apps.

As we documented in Why "Designed to Be Deleted" Is a Lie, the major dating platforms have financial incentives to keep users engaged rather than matched. The result is a user experience that is exhausting, demoralising, and increasingly feels like a waste of time. When real dating apps make human connection feel impossible, artificial connection starts to look appealing.

The numbers support this reading. Dating app engagement is declining - global installs fell 4% and sessions fell 7% in 2025 - while AI companion usage is surging. Users are not choosing AI over humans because they prefer artificial relationships. They are choosing AI because the available tools for finding real relationships have failed them.

This is the collision that should concern everyone in the dating industry. If traditional apps continue to optimise for engagement over outcomes, they are actively driving their user base towards AI alternatives. And once users have adapted to the frictionless validation of an AI companion, the already-challenging process of real human dating feels even harder by comparison.

A generation learning to relate to machines

The deeper concern is not about individual apps. It is about what happens to a society where a significant portion of emotional needs are met by artificial entities.

Consider the feedback loop:

  1. Dating apps fail to deliver genuine connection (exhaustion, cynicism, burnout)
  2. AI companions offer frictionless emotional validation (always available, always agreeable)
  3. Users' tolerance for real-world relationship difficulty decreases (why deal with rejection when the AI never rejects?)
  4. Real-world dating becomes even harder (lower social skills, higher expectations, less resilience)
  5. More users retreat to AI companions (the gap between real and artificial feels wider)

This is not science fiction. Each step in this loop is already happening, documented in the research cited above. The question is not whether this feedback loop exists, but how far it will go before the social cost becomes undeniable.

The case for genuine connection

The AI companion problem is ultimately not about technology. It is about what we optimise for.

If the goal is to minimise loneliness in the short term, AI companions work. They provide immediate emotional relief. They fill the gap.

If the goal is to help people build genuine, lasting, reciprocal human relationships, AI companions are counterproductive. They are a painkiller for a problem that requires surgery.

This is why Affinity Atlas takes a fundamentally different approach: instead of replacing human connection with artificial connection, the goal is to make genuine human connection more likely.

The discomfort of real relationships is not something to be engineered away. It is something to be navigated with the help of better tools. Tools that connect you with people who genuinely share your passions, your aesthetic sensibility, your way of seeing the world - and then get out of the way.

💘 The Affinity Atlas position: The answer to failed dating apps is not artificial relationships. It is better real ones. Connection that is genuine, reciprocal, and built on shared depth - not frictionless validation from a machine that cannot love you back.


Real connection. Real people.

Affinity Atlas matches you with humans who genuinely share your interests - not an algorithm optimised for engagement, and not a chatbot pretending to care.

Try the demo