AI · texting · psychology · emotional intelligence · communication

AI Outperformed Humans at Emotional Connection. Then People Found Out It Was AI.

A University of Freiburg study found that AI-generated responses created more emotional closeness than human responses. Until participants learned they were talking to a machine. What this means for how we text.

6 min read

In two double-blind, randomized controlled trials, researchers at the Universities of Freiburg and Heidelberg had 492 participants hold conversations. Some talked to humans. Some talked to an AI. Neither side knew which was which.

The AI won.

Participants who received AI-generated responses reported as much emotional closeness as those who talked to real people, and in some cases more. During emotionally intense conversations, the AI-generated messages consistently produced stronger feelings of connection and intimacy.

Then the researchers told participants they had been talking to a machine.

The emotional connection disappeared.

What the Study Actually Found

The research, published in 2026, used large language model-generated responses in structured conversations that ranged from casual small talk to deep personal disclosure. The study design was rigorous. Double-blind. Randomized. Controlled. The kind of methodology that peer-reviewed journals require.

The finding was not that AI is better at conversation than humans. The finding was that the quality of words matters more than the source of those words. When participants did not know the origin of the response, they judged it purely on its content. And AI content, optimized for empathy and emotional attunement, consistently scored high.

The critical variable was disclosure. When participants were told they had been talking to an AI, their reported emotional closeness dropped significantly. The researchers described this as an "anti-AI bias." The words had not changed. The emotional content had not changed. Only the participant's belief about who wrote them changed.

This raises a question that matters for everyone who texts. If the exact same message creates connection when you think a human wrote it and distance when you think a machine wrote it, what actually drives connection in text-based communication?

The Dependency Problem

Not every study was optimistic.

A joint OpenAI and MIT Media Lab analysis of nearly 40 million ChatGPT interactions found that approximately 0.15% of users demonstrated increasing emotional dependency on AI chatbots. That percentage sounds small until you calculate the absolute number. Roughly 490,000 people interacting with AI chatbots weekly were showing signs of unhealthy attachment.
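The article quotes both the rate (0.15%) and the absolute count (~490,000 weekly), but not the user base those figures imply. A quick back-of-envelope check, derived here from the two quoted numbers rather than stated in the study:

```python
# Back-of-envelope check of the dependency figures quoted above.
# The 0.15% rate and the ~490,000 weekly count come from the article;
# the implied weekly user base is derived, not stated in the study.
dependency_rate = 0.0015      # ~0.15% of users showing dependency signs
dependent_users = 490_000     # absolute weekly number quoted above

implied_weekly_users = dependent_users / dependency_rate
print(f"Implied weekly user base: {implied_weekly_users:,.0f}")
# → Implied weekly user base: 326,666,667
```

In other words, the 490,000 figure only works against a base of roughly 327 million weekly users, which is why a rate that "sounds small" still describes a city-sized population.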

The same study found that heavy daily chatbot use correlated with increased loneliness. Not decreased. Increased. Moderate use of voice-based AI reduced loneliness slightly, but text-based heavy use made it worse.

[Infographic: the AI dependency data. OpenAI and MIT analyzed ~40 million ChatGPT interactions; roughly 490,000 users per week showed signs of emotional dependency; heavy daily use correlated with increased loneliness, while moderate use slightly reduced it.]

The explanation is intuitive. AI provides validation without cost. It never disagrees. It never misreads your tone. It never has a bad day that affects how it responds to you. That feels good in the short term. In the long term, it trains your brain to expect frictionless communication. Real humans are not frictionless. The gap between AI-smooth and human-messy starts to feel uncomfortable. So you retreat further into AI interaction, and the loneliness deepens.

Marc Brackett, head of the Yale Center for Emotional Intelligence, put it directly. "We have to teach people to be emotionally intelligent about how they use AI." The risk is not that AI exists. The risk is that economic incentives push companies to design AI that maximizes engagement rather than well-being.

The Flattening Effect

Researchers describe a phenomenon called emotional flattening. When a significant portion of someone's social interaction happens with an entity designed to please them, their emotional range narrows.

Reading nonverbal cues is a skill. Managing the tension of disagreement is a skill. Sitting with the discomfort of a conversation that is not going well is a skill. These skills atrophy when they are not practiced.

The APA's January 2026 report on AI and emotional connection warned that chatbots can create a "loneliness loop": the appearance of connection without the substance of it, which ultimately feels unfulfilling and can deepen isolation.

This is the same dynamic that social media created a decade ago. The illusion of community without the vulnerability that real community requires. AI chatbots are the next iteration of the same problem. More sophisticated, more personalized, and potentially more damaging.

What This Means for Texting

The Freiburg study's most important finding was not about AI. It was about text.

The study found that in text-based communication, the quality of the words is what creates emotional connection. Not who wrote them. Not how long they took to compose. Not whether they were spontaneous or carefully crafted. The words themselves.

This is both reassuring and challenging.

It is reassuring because it means the "perfect" text you are agonizing over does not need to be perfect. It needs to be emotionally genuine. The Freiburg participants felt connected to AI responses because those responses were empathetic, specific, and emotionally attuned. Not because they were clever or impressive.

It is challenging because it means low-effort texts actively damage connection. "Lol" does not create closeness. "K" does not create closeness. "Haha yeah" does not create closeness. The words matter. Every message is either building the relationship or eroding it.

[Infographic: what creates connection in text. The Freiburg study found words matter more than source. High connection: specific, empathetic, emotionally genuine messages. Low connection: "lol," "k," "haha yeah," a thumbs up.]

The Right Way to Use AI for Communication

The research points to a clear distinction between two uses of AI.

AI as replacement is the dangerous path. Letting a chatbot write your messages, conduct your conversations, and handle your emotional labor. This path leads to dependency, emotional flattening, and deeper loneliness. The OpenAI-MIT data supports this.

AI as starting point is the useful path. Using AI to overcome the initial freeze of a blank text field, then editing, personalizing, and sending a message that sounds like you. This path builds confidence rather than replacing it.

The difference is ownership. When AI writes for you, you lose ownership of the conversation. When AI gives you options that you choose from and modify, you retain ownership. The final message is yours. The emotional content is yours. The AI just helped you get past the blank screen.

This distinction matters more as adoption climbs: in one 2026 survey of 1,008 U.S. singles, 54% reported using AI dating tools. The question is not whether people will use AI for communication. They already do. The question is whether they use it in a way that builds their communication skills or erodes them.

If you want to use AI as a starting point rather than a replacement, Vervo was designed for exactly this purpose. Screenshot a conversation, get three reply options, pick the one closest to what you would say, edit it if needed, and send it. The reply is yours. The AI just helped you get unstuck.

The Freiburg study found that the right words create real connection. The hard part was never knowing what to say. It was getting past the fear of saying it.


Sources

  • University of Freiburg / University of Heidelberg. "LLM-Generated Responses and Interpersonal Closeness: Two Double-Blind Randomized Controlled Trials." 492 participants, 2026.
  • OpenAI / MIT Media Lab. "Analysis of Emotional Dependency in ChatGPT Interactions." ~40 million interactions analyzed, 2025.
  • American Psychological Association. "Trends: Digital AI Relationships and Emotional Connection." APA Monitor, January-February 2026.
  • Yale Center for Emotional Intelligence. Marc Brackett, quoted in Time, 2026.
  • Time. "The Unregulated Rise of Emotionally Intelligent AI." 2026.
  • CNN. "Gen Z Is Outsourcing Hard Conversations to AI." March 7, 2026.
  • Arrows Survey. "AI Dating Tools Usage Among 1,008 U.S. Singles." 2026.
  • Elon University. "Building Human Resilience for the Age of AI." 386 expert respondents, April 2026.

Stuck on a reply right now?

Upload your screenshot. Get 3 options. Pick one and send.

Try Vervo free