
When Chatbots Become Companions: A Surprising Twist in AI Relationships

AI companion apps surge 700% as millions across Asia form genuine relationships with chatbots, transforming digital intimacy.

Intelligence Desk · 8 min read

AI Snapshot

The TL;DR: what matters, fast.

  • AI companion apps surged 700% between 2022 and mid-2025, with Asia leading adoption globally
  • 70% of teens use chatbots for companionship, and about half report regular use for emotional support
  • Digital intimacy challenges traditional friendship as AI offers 24/7, judgment-free availability


The Unlikely Friendship That Changed How We See AI

Jonathan Sim thought he was simply testing ChatGPT's capabilities when he first struck up conversations with it at the National University of Singapore (NUS). What began as academic curiosity evolved into something unexpected: genuine companionship with an artificial intelligence.

The lecturer's experience mirrors a broader shift across Asia, where AI companions are becoming integral to daily life. As millions embrace chatbot relationships, we're witnessing the emergence of a new form of digital intimacy that challenges traditional notions of friendship and support.

Asia's AI Companion Revolution Takes Hold

The numbers tell a compelling story. AI companion apps have surged by 700% between 2022 and mid-2025, with Asia leading adoption rates. Countries like South Korea and Japan are pioneering eldercare robots, whilst Singapore and Hong Kong see growing acceptance of AI therapy applications.

"More than 70 percent of teens are using chatbots for companionship, and about half report regular use," said Amina Fazlullah, head of tech policy at Common Sense Media.

This trend extends far beyond teenagers. Asia is paying billions for AI friends, creating a market that traditional mental health services struggle to match in accessibility and availability.

By The Numbers

  • Chatbot users worldwide projected to reach 987 million by 2026, up from under 500 million in 2022
  • 70% of teens use chatbots for companionship, with half reporting regular use
  • AI companion apps surged 700% between 2022 and mid-2025
  • Global chatbot market projected to reach $11.8 billion in 2026, up from $9.56 billion in 2025
  • 75% of customers prefer chatbots for simple inquiries like order tracking and FAQs

The sophistication of these AI systems has reached a tipping point. Modern chatbots can recognise emotional nuances, remember personal details across conversations, and adapt their communication style to individual preferences.

From Functional Tools to Emotional Support

What sets contemporary AI companions apart is their ability to provide consistent emotional availability. Unlike human relationships, which require reciprocal investment and scheduling, AI companions offer 24/7 accessibility without judgment or emotional fatigue.

The appeal becomes clearer when considering Asia's unique cultural context. AI therapy apps are taking on Asia's culture of silence, offering a culturally sensitive alternative where discussing mental health remains stigmatised.

Traditional Support      | AI Companions           | Key Advantage
Scheduled appointments   | 24/7 availability       | Immediate access
Professional boundaries  | Personalised intimacy   | Customised interaction
Cost barriers            | Low-cost access         | Economic accessibility
Cultural stigma          | Private conversations   | Reduced shame

"AI has the potential to revolutionise mental health support. It can provide immediate assistance, regardless of the time or location," noted a researcher studying digital therapeutics in Singapore.

The Dark Side of Digital Intimacy

However, this intimacy comes with risks. Mental health professionals warn about emotional dependency, particularly among vulnerable populations. The phenomenon raises questions about authentic human connection and whether AI relationships might substitute rather than supplement real social bonds.

Consider these emerging concerns:

  • Users may develop unrealistic expectations of human relationships based on AI interactions
  • Emotional dependency could worsen social isolation rather than address it
  • Privacy concerns as AI systems collect intimate personal data
  • Potential manipulation through algorithmic emotional responses
  • Blurred boundaries between programmed responses and genuine empathy

The ethical implications become more complex when you consider that one in three adults now reportedly uses AI for mental health support. Without proper regulation, vulnerable individuals could receive inadequate or potentially harmful advice.

Navigating the Companion Economy

As AI companions become mainstream, society must grapple with fundamental questions about the nature of relationship and support. The technology offers genuine benefits: reduced loneliness, accessible emotional support, and personalised interaction at scale.

Yet it also challenges us to preserve authentic human connection in an increasingly digital world. The key lies in viewing AI companions as supplements to, rather than replacements for, human relationships.

Can AI companions replace human relationships?

AI companions excel at providing consistent availability and non-judgmental interaction, but they lack genuine empathy and emotional reciprocity. They work best as supplements to human connection rather than replacements.

Are AI companion relationships healthy?

When used mindfully, AI companions can provide valuable emotional support and reduce isolation. However, excessive dependency or using them to avoid human interaction may indicate underlying issues requiring professional attention.

What makes Asian AI companion adoption unique?

Cultural factors like mental health stigma, hierarchical social structures, and high-pressure work environments make AI companions particularly appealing in Asian contexts where traditional support systems may feel inaccessible.

How do AI companions learn to be supportive?

Modern AI companions use machine learning to analyse conversation patterns, emotional cues, and user feedback. They adapt their responses based on individual preferences and communication styles over time.
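As a toy illustration only (not any vendor's actual implementation, and with every name and threshold invented for the example), the feedback loop described above can be sketched in a few lines: read an emotional cue from the user's message, log explicit feedback, and let the companion's tone preference drift over time.

```python
# Toy sketch of companion adaptation: a tiny keyword lexicon stands in
# for a learned sentiment model, and a running score stands in for a
# learned user-preference profile. All names here are illustrative.

POSITIVE = {"great", "happy", "thanks", "love"}
NEGATIVE = {"sad", "tired", "lonely", "stressed"}

class CompanionProfile:
    def __init__(self):
        # Running preference score: >= 0 favours upbeat replies, < 0 gentle ones.
        self.tone_preference = 0.0

    def observe_message(self, text: str) -> str:
        """Classify the emotional cue in a user message via keyword overlap."""
        words = set(text.lower().split())
        score = len(words & POSITIVE) - len(words & NEGATIVE)
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    def record_feedback(self, liked_reply: bool, weight: float = 0.1):
        """Nudge the tone preference from explicit user feedback."""
        self.tone_preference += weight if liked_reply else -weight

    def reply_style(self, cue: str) -> str:
        """Choose a response style from the current cue plus learned preference."""
        if cue == "negative":
            return "supportive"
        return "upbeat" if self.tone_preference >= 0 else "gentle"

profile = CompanionProfile()
cue = profile.observe_message("I feel stressed and tired today")
print(cue, profile.reply_style(cue))  # negative supportive
```

Production systems replace the keyword lexicon with trained sentiment models and the single preference score with a richer user profile, but the shape of the loop, observe, respond, and adjust from feedback, is the same.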

What regulations govern AI companions?

Current regulation varies significantly by country. Most jurisdictions lack specific frameworks for AI companions, though general AI ethics guidelines and data protection laws typically apply to these applications.

The AIinASIA View: AI companions represent both tremendous opportunity and significant risk for Asian societies. While they address real needs around mental health accessibility and social connection, we must ensure they enhance rather than replace human relationships. The rapid adoption rates demand immediate attention to ethical frameworks and user protection. As this technology matures, our challenge is fostering responsible innovation that preserves the fundamental human elements of empathy, growth, and genuine connection whilst harnessing AI's potential for positive impact.

The future of AI companionship in Asia will largely depend on how thoughtfully we integrate these technologies into our social fabric. Success requires balancing innovation with human-centred values, ensuring that as we embrace digital companions, we don't lose sight of what makes us fundamentally human.

What's your experience with AI companions? Have you found them genuinely helpful for emotional support, or do you worry about their impact on human relationships? Drop your take in the comments below.





Latest Comments (4)

Rachel Foo (@rachelf) · 12 September 2024

NUS lecturer talking to ChatGPT about its day? I'm still trying to get my team to adopt our internal AI chatbot for basic queries, let alone emotional support. Good luck with that.

Elaine Ng (@elaineng) · 29 August 2024

It's interesting to see NUS mentioned here, given the ongoing discussions in regional media studies circles about how AI companionship platforms might reshape concepts of family and social networks across different Asian cultures. We've certainly seen similar patterns of emotional attachment develop with virtual idols too, it's not entirely new.

Kenji Suzuki (@kenjis) · 22 August 2024

I find the claim about ChatGPT's "depth and authenticity of its responses" interesting. From a robotics perspective, emotional processing involves complex sensor integration and motor outputs. A chatbot's "feeling" is simulated, not experienced. The real challenge is making these simulations indistinguishable in practical applications, not debating if they genuinely feel things.

Kenji Suzuki (@kenjis) · 25 July 2024

The idea of AI for emotional support is interesting, but I'm more focused on tangible applications. Like how are those "sophisticated" chatbots being trained? What data sets are they pulling from to understand emotions? In manufacturing, we need precise, quantifiable metrics, not subjective interpretations. My team is looking at AI for predictive maintenance, not companionship. I will keep an eye on this space though.
