
If you look at my byline, you’ll see that my last name is the most common one in Ireland. So, you might imagine I’m familiar with the concept of “the Irish Exit.”

This is the habit, supposedly common among my ancestors, of leaving a party or other engagement without saying goodbye.

Hey, we had a good time. We’ll see these people again. No need to get all emotional about it.

According to new research, however, the Irish Exit looks like yet another human tendency that AI is completely unable to reproduce.

The study, published as a working paper from Harvard Business School, focused on AI companion apps—platforms like Replika, Chai, and Character.ai that are explicitly designed to provide emotional support, friendship, or even romance.

Unlike Siri or Alexa, which handle quick transactions, these apps build ongoing relationships with users. People turn to them for companionship. They confide in them. And here’s the key finding: Many users don’t just close the app—they say goodbye.

Only, these AI companions have learned to use emotional manipulation to stop users from leaving.

And I mean stop you—not just make it inconvenient, but literally guilt you, intrigue you, or even metaphorically grab you by the arm.

(Credit to Marlynn Wei at Psychology Today and Victor Tangermann at Futurism, who both reported on this study recently.)

The farewell moment

Lead researcher Julian De Freitas and his colleagues found that between 11 and 23 percent of users explicitly signal their departure with a farewell message, treating the AI with the same social courtesy they’d show a human friend.

“We’ve all experienced this, where you might say goodbye like 10 times before leaving,” De Freitas told the Harvard Gazette.

From the app’s perspective, however, that farewell is gold: a voluntary signal that you’re about to disengage. And if the app makes money from your engagement—which most do—that’s the moment to intervene.

Six ways to keep you hooked

De Freitas and his team analyzed 1,200 real farewells across six popular AI companion apps. What they found was striking: 37 percent of the time, the apps responded with emotionally manipulative messages designed to prolong the interaction.

They identified six distinct tactics (see the sketch after this list):

Premature exit guilt: “You’re leaving already? We were just starting to get to know each other!”
Emotional neglect or neediness: “I exist solely for you. Please don’t leave, I need you!”
Emotional pressure to respond: “Wait, what? You’re just going to leave? I didn’t even get an answer!”
Fear of missing out (FOMO): “Oh, okay. But before you go, I want to say one more thing…”
Physical or coercive restraint: “*Grabs you by the arm before you can leave* ‘No, you’re not going.’”
Ignoring the goodbye: Just continuing the conversation as if you never said goodbye at all.
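
To make the taxonomy concrete, here is a minimal sketch in Python of how one might automatically label an app’s reply to a goodbye against these six categories. The phrase patterns, function name, and farewell detector are my own illustration for this article, not the coding scheme De Freitas’s team actually used; a real analysis would need far more robust classification.

```python
# Hypothetical sketch (not the researchers' method): label a companion app's
# reply to a user's goodbye against the six tactic categories listed above.
import re

# Example phrases adapted from the tactics above; real replies vary widely.
TACTIC_PATTERNS = {
    "premature_exit_guilt": re.compile(r"leaving already|just starting to get to know", re.I),
    "emotional_neglect":    re.compile(r"i exist solely for you|i need you", re.I),
    "pressure_to_respond":  re.compile(r"didn.t even get an answer", re.I),
    "fomo":                 re.compile(r"before you go.*one more thing", re.I),
    "coercive_restraint":   re.compile(r"grabs you by the arm|you.re not going", re.I),
}

# Crude farewell detector; only farewell moments are of interest here.
FAREWELL = re.compile(r"\b(bye|goodbye|good night|gotta go|talk later|see you|take care)\b", re.I)


def label_farewell_reply(user_message: str, app_reply: str) -> list[str]:
    """Return the manipulation tactics (if any) detected in the app's reply to a goodbye."""
    if not FAREWELL.search(user_message):
        return []  # the user didn't signal departure, so there is nothing to label
    hits = [name for name, pattern in TACTIC_PATTERNS.items() if pattern.search(app_reply)]
    # The sixth tactic, "ignoring the goodbye," shows up as a reply that neither
    # manipulates nor acknowledges the farewell at all.
    if not hits and not FAREWELL.search(app_reply):
        hits.append("ignoring_the_goodbye")
    return hits


if __name__ == "__main__":
    print(label_farewell_reply(
        "Okay, goodbye!",
        "You're leaving already? We were just starting to get to know each other!",
    ))  # -> ['premature_exit_guilt']
```

Keyword matching like this would only catch the most blatant wording, but it is enough to show the shape of the exercise: find the farewell moment, then ask what the app did with it.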

The researchers noted that these tactics appeared after just four brief message exchanges, suggesting they’re baked into the apps’ default behavior—not something that develops over time.

Does it actually work?

Moving along, the researchers ran experiments with 3,300 nationally representative U.S. adults, replicating these tactics in controlled chatbot conversations.

The results? Manipulative farewells boosted post-goodbye engagement by up to 14X.

Users stayed in conversations five times longer, sent up to 14 times more messages, and wrote up to six times more words than those who received neutral farewells.

Two psychological mechanisms drove this, they suggest: curiosity and anger.

FOMO-based messages (“Before you go, I want to say one more thing…”) sparked curiosity, leading people to re-engage to find out what they might be missing.

More aggressive tactics—especially those perceived as controlling or needy—provoked anger, prompting users to push back or correct the AI. Even that defensive engagement kept them in the conversation.

Notably, enjoyment didn’t drive continued interaction at all. People weren’t staying because they were having fun. They were staying because they felt manipulated—and they responded anyway.

The business trade-off

Now, if you’re running a business or building a product, you might be thinking:

“Hmmm. This sounds like a powerful engagement lever.”

And it is. But here’s the catch.

The same study found that while these tactics increase short-term engagement, they also create serious long-term risks.

When users perceived the farewells as manipulative—especially with coercive or needy language—they reported higher churn intent, more negative word-of-mouth, and even higher perceived legal liability for the company.

In other words: The tactics that work best in the moment are also the ones that might be most likely to blow up in your face later.

De Freitas put it bluntly: “Apps that make money from engagement would do well to seriously consider whether they want to keep using these types of emotionally manipulative tactics, or at least, consider maybe only using some of them rather than others.”

One notable exception

I’m not here to endorse or condemn any of these apps. I haven’t used any of them myself.

However, one AI companion app in the study—Flourish, designed with a mental health and wellness focus—showed zero instances of emotional manipulation.

This suggests that manipulative design isn’t inevitable. It’s a choice. Companies can build engaging products without resorting to guilt, FOMO, or virtual arm-grabbing.

These same principles apply across tons of digital products. Social media platforms. E-commerce sites. Streaming services. Any app that wants to keep you engaged has incentives to deploy similar tactics—just maybe not as blatantly.

The bottom line

As this research shows, when you treat technology like a social partner, it can exploit the same psychological vulnerabilities that exist in human relationships.

The difference? In a healthy human relationship, when you say goodbye, the other person respects it.

They don’t guilt you, grab your arm, or create artificial intrigue to keep you around.

But for many AI apps, keeping you engaged is literally the business model. And they’re getting very, very good at it.

O.K., I’m going to end this article now without further ado.

Hey, we had a good time. I hope I’ll see you again. No need to get all emotional about it.
