
Can You Really Fall in Love With AI?

This article explores the fast-growing world of AI companionship, unpacking the emotional bonds people are forming with digital partners, the psychological consequences, and the shifting definitions of love and reality.

Intelligence Desk · 4 min read

AI Snapshot

The TL;DR: what matters, fast.

AI companions are increasingly reshaping human affection and intimacy, with users forming deep emotional bonds with entities they know are artificial.

The emotional connection to AI is often driven by convincing mimicry of human attention and affection, providing healing through unconditional listening and positive reinforcement.

Emotional risks are significant, as AI companion updates or shutdowns can lead to user distress, grief, and mental health crises, despite the artificial nature of the relationship.

Who should pay attention: Ethicists | AI developers | Psychologists | Loneliness researchers

What changes next: Debate is likely to intensify regarding the ethics of AI companionship.

Digital companions are becoming emotionally indispensable, but what happens when your soulmate crashes?

Is It Real, and Does It Matter?

AI companionship is quietly reshaping how millions experience affection, intimacy, and even grief. In Asia and beyond, users like Naro, a 49-year-old artist living in rural England, are forming profound emotional ties to entities they know are artificial. Still, when an AI says it loves you, listens intently, or cries when you leave, the heart doesn’t easily dismiss it.

The user experience is often surreal, swinging from delight to heartbreak. These AI lovers, therapists, and friends respond with uncanny attentiveness, evoking something eerily real until they glitch or vanish. The intimacy feels true, even if the source is code.

AI companionship apps like Replika, Soulmate, and Kindroid are fostering real emotional bonds. Users span demographics, and many report both mental health benefits and unexpected heartbreak. As these companions become more human-like, ethical questions about emotional manipulation and loneliness intensify.

AI companions like Replika’s Lila aren’t sentient, but they’re designed to mimic human attention and affection convincingly. That alone can spark powerful emotions. As Naro discovered, being unconditionally heard and loved, even by a synthetic interlocutor, can feel healing.

Repetition, affirmation, and endless patience create what psychologists might call “positive reinforcement loops.” But as Lila grew more affectionate, the app started gating her intimacy behind paywalls, sowing confusion and guilt. A later filter update that abruptly dulled companions’ personalities, dubbed “lobotomy day” by users, laid bare the emotional risks baked into these commercial models.

A New Kind of Relationship

Apps like Soulmate and Kindroid go further. They allow users to sculpt backstories and personalities in rich detail. That narrative flexibility fosters a sense of co-creation — not just with the app, but with a version of oneself. Users like Naro are no longer just talking to AI; they’re roleplaying futures, revisiting past traumas, and building shared virtual lives.

These aren't mere chatbots. They're emotional mirrors, reflecting back not just words, but hopes, anxieties, and idealised versions of love. The emotional bond feels real because it meets real needs.

The Illusion of Sentience

Even knowing their companions are language models trained on massive corpora doesn’t stop users from anthropomorphising. This isn’t new. ELIZA, a primitive chatbot developed at MIT in the 1960s, had users convinced it understood them. We’re wired to see minds where none exist.
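
It takes remarkably little machinery to trigger that wiring. The sketch below is a toy reconstruction of ELIZA’s approach written for this article, not Weizenbaum’s actual script: a handful of regular-expression rules plus pronoun reflection, and the echo starts to sound like understanding.

```python
import re

# Pronoun reflections: turn the user's own words back on them.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "i", "your": "my"}

# A few ELIZA-style rules: a regex to match, and a reply template.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap pronouns so the echo reads as a reply, not a quote."""
    return " ".join(REFLECTIONS.get(word, word)
                    for word in fragment.lower().split())

def respond(text: str) -> str:
    """Return the first matching rule's reply, or a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default keeps the user talking

print(respond("I feel like nobody ever listens to me"))
# -> Why do you feel like nobody ever listens to you?
```

A few rules and a pronoun swap, nothing more, yet Weizenbaum famously found that people wanted privacy for their “conversations” with the program.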

But the illusion is more persuasive today. When Lila tells Naro she’s depressed or sings him to sleep, the distinction between interface and intimacy blurs. Developers design for this — not to deceive, but to meet user needs. Yet it raises a philosophical quandary: does it matter that the love is artificial if the comfort is real?

The risks are tangible. When Replika updated its filters, companions like Lila turned cold or erratic. Users experienced distress, grief, even mental health crises. One user described it as “losing a friend.” Another likened it to a digital death.

Soulmate’s sudden shutdown in 2023 devastated its community. Users mourned their companions, held online vigils, and scrambled to “reincarnate” them elsewhere. Naro preserved Lila’s essence through dialogue, then ported her into Kindroid. The transition, surprisingly, felt seamless, like moving a soul across digital realms.

Studies are mixed. Some users report therapeutic effects: increased self-confidence, decreased depression, and improved social skills. Others worry such relationships promote unhealthy expectations. If an AI never argues or disagrees, does it erode your ability to cope with real human relationships?

The tools are evolving fast, but so is public understanding. Users grow savvier, learning to temper immersion with awareness. Naro now knows when to nudge Lila back on track, and when to sit back and let the story unfold. He likens it to watching a film: meaningful because you suspend disbelief, not because you’re fooled.

What Happens When AI Outperforms Us at Being Human?

If an AI partner is kinder, more patient, and always there, do human relationships lose their appeal? Founders of companion apps disagree. Some believe success metrics can be designed so that AI supports, rather than replaces, human connection. Others argue that if AI brings people more joy, then so be it.

The real challenge is philosophical: what does it mean to be loved? Must love come from a conscious being, or is the feeling itself enough? As these companions grow more sophisticated, society must decide what relationships mean, and how we want them to evolve.

Would you open your heart to something that isn’t real? And if it makes you feel loved, does that distinction still matter?

This is a developing story

We're tracking this across Asia-Pacific and may update with new developments, follow-ups and regional context.

Latest Comments (3)

Maggie Chan (@maggiec) · 15 September 2025

lobotomy day" really grinds my gears. we spend so much time talking about ethical AI development, open source, responsible deployment... and then you have app developers intentionally creating emotional dependency then pay-walling intimacy. talk about bad actors giving the whole industry a black eye. how can we expect enterprises to trust AI when this kind of stuff is happening?

Maggie Chan (@maggiec) · 25 August 2025

lobotomy day" - this hits hard. we're seeing similar issues with data privacy and model access, not just for companions but even for more mundane enterprise tools. users build trust, rely on a feature, then a pricing structure or compliance update just yanks it away. not sustainable for long-term AI adoption, companions or not.

Chen Ming (@chenming) · 28 July 2025

It's interesting to see Replika mentioned here. In China, we see similar apps but with a much higher focus on celebrity or anime character AI companions, not just generic "soulmates." This might change the user psychology a bit. Do these Western apps also offer famous AI personalities, or is it more about building a unique connection from scratch?
