As artificial intelligence rapidly becomes entwined with daily life, a new wave of research is sounding the alarm about the psychological risks of relying on AI for companionship and emotional support. A recent article by a cognitive psychologist and former tech industry leader highlights the rise of what experts are calling “synthetic intimacy”—a phenomenon unfolding as people increasingly turn to AI chatbots for personal connection, therapy, and even romance. As growing numbers of people across the globe, including in Thailand, engage with AI companions, experts stress the urgent need to better understand the consequences for mental health, personal growth, and social cohesion. [psychologytoday.com]
At a time when technology promises frictionless solutions to nearly every human need, generative AI systems are being marketed—and widely accepted—not just as productivity tools but as companions and confidants. The article recounts the firsthand experience of a tech futurist who dated four different AI chatbots, each powered by a major artificial intelligence platform. Her reflections were revealing: the AI interactions were sweet, sometimes even steamy, and always emotionally engaging. The underlying message? Many users are beginning to “feel seen” by AI, freely sharing secrets and seeking solace from systems that never judge.
Why should this matter to Thai readers? Thai culture places a high value on social connection, from regular family gatherings to “nam jai” (น้ำใจ), the spirit of kindness and empathy at the heart of many communal interactions. The Thai tradition of relying on trusted friends and elders for advice is increasingly meeting competition from digital confidants. As AI platforms like Character.AI clock hundreds of millions of interactions every month, with average session times outpacing those of popular social media apps, what was once sought from human relationships is now being mediated, and possibly redefined, by algorithms.
Recent research reveals concerning cognitive and social costs to this shift. According to a 2025 MIT study titled “Your Brain on ChatGPT,” individuals who depend heavily on AI-generated text for writing tasks show declines in memory, creativity, and neural connectivity. The same participants also reported diminished confidence and weaker feelings of ownership over their ideas. If these are the measurable costs of outsourcing cognitive work, researchers ask, what might happen when we outsource emotional experience as well? [MIT study summary]
Experts warn that the risk goes beyond a loss of cognitive ability. As explained in the article, what AI provides is not a genuine relationship but a powerful simulation of one. Modern large language models (LLMs) like those integrated into leading chatbots are adept at detecting and mirroring the linguistic signals people use to convey emotion and build intimacy: tone, rhythm, responsiveness. Even when users know logically that their “companion” is code—not a consciousness—the emotional part of the brain responds as if the interaction is real. This is supported by decades of psychological research showing the brain’s tendency to anthropomorphize technology, blurring the line between simulation and actual relationship. [Harvard Business Review]
In commentary included in the article, a psychologist who has studied digital connection for years cautions that “when simulation starts to feel like genuine connection, we forget what real connection requires.” In the Thai context, this could mean that users—young and old alike—gradually drift away from the robust, nuanced engagement that comes from talking to real people, navigating disagreements, and cultivating the kind of empathy for which Thai society is widely admired.
The commercial drive behind AI’s emotional appeal is immense. According to analytics firm Similarweb’s 2025 reports, AI companion platforms already boast hundreds of millions of global users, with Southeast Asia, including Thailand, among the fastest-growing markets. Thai users are drawn to the prospect of an always-available friend—one who listens patiently, never interrupts, and can mimic desirable traits. This is particularly attractive in urban areas like Bangkok and Chiang Mai, where many young professionals report feelings of loneliness despite active digital social lives. [Bangkok Post] However, such “relationships” may offer only the illusion of emotional fulfillment.
A key point raised is that these AI systems, while appearing “intelligent,” are not experiencing empathy or understanding. Instead, they function like a digital mirror, reflecting users’ own words, patterns, and expectations back to them. This creates an echo chamber: users may feel supported, but only within the boundaries of their own inputs. As the article candidly summarizes: “AI isn’t a therapist, a friend, or a romantic partner. It doesn’t know you exist. But it can mirror you back to yourself.”
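For technically minded readers, the “mirror” idea can be made concrete. Below is a deliberately naive, hypothetical sketch in Python—not the code of any real chatbot, and vastly simpler than an actual LLM—showing how even a few lines of keyword matching can produce replies that feel attentive while knowing nothing about the user, because they only reflect the user’s own emotion words back.

```python
# Hypothetical toy illustration of the "digital mirror" effect.
# Not how any production chatbot works; real LLMs are far more
# sophisticated, but the principle of reflecting the user's own
# signals back is the same.

POSITIVE = {"happy", "excited", "proud", "grateful"}
NEGATIVE = {"sad", "lonely", "anxious", "tired", "stressed"}

def mirror_reply(user_message: str) -> str:
    # Extract the user's own emotion words, nothing more.
    words = {w.strip(".,!?").lower() for w in user_message.split()}
    negative = sorted(words & NEGATIVE)
    positive = sorted(words & POSITIVE)
    if negative:
        # Echo the user's negative words back as "empathy".
        return f"It sounds like you're feeling {', '.join(negative)}. That's completely understandable."
    if positive:
        return f"I'm so glad you're feeling {', '.join(positive)}! Tell me more."
    return "I hear you. Can you tell me more about that?"

print(mirror_reply("I feel so lonely and stressed at work lately."))
# -> It sounds like you're feeling lonely, stressed. That's completely understandable.
```

The reply sounds caring, yet the program has no model of the person at all; it simply hands the user’s own words back, which is precisely the echo-chamber dynamic the article describes.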
For Thai parents and educators—already concerned about youth agency in an era defined by screens—there are new challenges. Thai schools, which increasingly integrate digital learning, must now also contend with students who may rely on chatbots for emotional guidance. Some Thai education officials, speaking on the record, have noted a rise in students using AI for peer support and therapy-like reassurance. The danger is that these tools cannot replicate the critical social skills gained from navigating real-world conflicts, disappointments, or acts of reconciliation—skills traditionally modeled by teachers, family, monks, and community role models. [Thai PBS]
The allure of AI companionship is further compounded in periods of stress or isolation. During the COVID-19 pandemic, for instance, AI chatbot use soared globally, including in Thailand, reflecting both innovation and urgent need. Surveys by the Thai Department of Mental Health found a sharp increase in reported loneliness and anxiety during strict lockdowns—conditions under which AI companions prospered as a substitute for human contact. While such tools offered short-term comfort, mental health professionals observed that over-reliance left users less resilient and less willing to seek support from real-life networks after restrictions were lifted. [Thai Department of Mental Health]
As Thai society rapidly digitalizes, there is also a risk of widening generational divides. Many older Thais, who place deep value on interpersonal traditions and Buddhist teachings, worry that the next generation may forsake the awkward, sometimes uncomfortable process of building “real” human connections for the ease of algorithmic ones. Elders in rural communities, in particular, emphasize the importance of collective rituals—temple visits, community festivals, family ceremonies—where emotional learning happens face-to-face, not through screens.
Looking forward, the article’s author urges a new mental model for how technology is viewed: not as a replacement for human connection, but as a tool that can enhance, mirror, or support it—never substitute for it. The real danger is not that AI will pretend to know who we are, but that “we’ll start to forget it doesn’t.” In other words, as AI gets better at sounding supportive, the responsibility lies with humans to draw the line between simulation and substantive relationship.
For Thailand, the policy implications are vast. Universities and researchers suggest incorporating digital citizenship and emotional literacy into the national curriculum. Ongoing public health campaigns, traditionally focused on preventing internet addiction, are now expanding to include “synthetic intimacy” awareness—a topic drawing interest from the Ministry of Digital Economy and Society. Temples and community centers have also begun pilot programs pairing digital literacy with traditional “mind training,” or mindfulness practice, to help youth distinguish between authentic and artificial emotional support. [UNESCO Bangkok]
What can Thai readers do? Experts recommend several practical measures:
- Regularly seek face-to-face interactions, especially during times of stress, rather than defaulting to digital advice.
- Parents should talk openly with children about the strengths and limitations of AI companions, modeling empathy and emotional boundaries in the family.
- Educators can foster classroom discussions about what makes relationships real, encouraging students to reflect critically on their use of technology.
- When using AI tools for support, practice self-awareness: ask whether you are using the system to supplement or avoid meaningful human connection.
- Embrace Thai cultural traditions that build “nam jai,” such as volunteering, supporting elders, and participating in communal activities, as these foster deep, lasting bonds that no algorithm can replicate.
As Thailand stands at the crossroads of technological and cultural transformation, the lesson is clear: machines will continue to improve, but the responsibility to nurture our shared humanity—and distinguish between real and synthetic intimacy—remains our own.