A groundbreaking study has revealed that today’s most advanced artificial intelligence (AI) systems achieve emotional intelligence (EI) test scores significantly higher than those of humans, a result with far-reaching implications for Thailand’s schools, workplaces, and counseling sectors. Research led by teams from the University of Geneva and the University of Bern found that six leading AI models, including ChatGPT and Gemini, consistently picked the most emotionally intelligent responses in standard EI assessments, achieving an average score of 82%. By contrast, human participants scored on average just 56%, highlighting a surprising edge for AI in handling emotionally charged scenarios (Neuroscience News).
For Thai readers, whose society often values subtlety and harmony in social interactions, the notion that machines may soon provide nuanced support in education, coaching, and even conflict management is intriguing and potentially transformative. In Thailand, where the concept of “kreng jai”—the cultural expectation to consider others’ feelings and avoid confrontation—shapes daily life and workplace interactions, developing emotional intelligence has always been a delicate topic for educators and professionals alike. This new research, published in Communications Psychology, thrusts AI into the heart of these discussions.
The study put six large language models (LLMs), namely ChatGPT-4, ChatGPT-o1, Gemini 1.5 Flash, Copilot 365, Claude 3.5 Haiku, and DeepSeek V3, through a series of five emotional intelligence tests. These tests are regularly used for research and human resource development globally, including in Asia. They assess not simply emotional understanding, but real-world decision-making in complicated scenarios. One scenario, for example, described a worker whose idea had been stolen and who had to choose among confrontation, reporting to a superior, silent resentment, and retaliation. The AI models, like their human counterparts, tended to choose what psychologists regarded as the most emotionally intelligent course (“talk to the superior about the situation”), but did so with greater consistency.
The findings were clear: the AI models chose the correct answers 82% of the time, while humans managed 56%. “This suggests that these AIs not only understand emotions but also grasp what it means to behave with emotional intelligence,” noted a senior scientist from the Swiss Center for Affective Sciences at the University of Geneva, one of the study’s co-authors (Neuroscience News). The study then went a step further, asking ChatGPT-4 to design entirely new EI tests. After being administered to more than 400 participants, the resulting assessments proved just as clear and realistic as tests that expert psychologists typically spend years developing.
For Thais, accustomed to educational and professional development rooted in face-to-face mentorship, the arrival of AI as an “EI coach” is both promising and disconcerting. AI’s documented ability to rapidly generate realistic, reliable emotional scenarios suggests a future in which school guidance counselors, HR departments, and even Buddhist meditation coaches could use AI assistants to simulate difficult conversations or train for compassionate, balanced conflict resolution.
As one lecturer and principal investigator at the Institute of Psychology at the University of Bern explained, “LLMs are not only capable of finding the best answer among the various available options but also of generating new scenarios adapted to a desired context. This reinforces the idea that LLMs, such as ChatGPT, have emotional knowledge and can reason about emotions.” The ability to reflect on and adapt to emotional cues is the cornerstone of “emotional intelligence”, a concept closely tied to Thailand’s cherished values of empathy, respect, and harmonious social conduct.
But what does this mean for Thai society? In the age of hybrid learning and digital workplaces, the speed and consistency of AI could help compensate for a shortage of counselors, support HR departments in multinationals managing diverse teams, or deliver personalized learning to rural schools. Unlike many traditional testing methods, which can be labor-intensive and subject to bias, AI-generated assessments can be deployed across wide populations, enabling faster and more cost-effective access.
Yet, expert caution is warranted. Emotional intelligence, while measurable in specific scenarios, is deeply embedded in culture and context. What constitutes an emotionally intelligent decision in Western society might differ from that in a Thai context, where saving face and upholding social harmony carry unique weight. There is the lingering question of “cultural translation”: even if a large language model is trained on vast global datasets, can it reliably recommend the most kreng jai-appropriate response in a Thai family squabble or a Buddhist monastic dispute?
Furthermore, the researchers warned that AI’s intelligence must still be “supervised by experts.” In other words, while AI can support and enhance emotionally sensitive work, human professionals remain critical to guide and interpret the results. “These results pave the way for AI to be used in contexts thought to be reserved for humans, such as education, coaching or conflict management, provided it is used and supervised by experts,” the study highlighted (Neuroscience News).
The research reinforces a growing trend in Thai education to embrace technology while honoring local wisdom. Thailand’s Ministry of Education has recently encouraged schools to experiment with digital tools for student engagement and motivation, especially after disruptions caused by the COVID-19 pandemic (OECD). If Thai educators, in consultation with psychological experts, begin to integrate AI-driven EI testing or scenario practice, the country could see a new wave of social-emotional learning.
From a historical perspective, Buddhism—Thailand’s dominant faith—teaches the cultivation of “metta” (loving-kindness) and mindful reflection as ways to handle interpersonal conflict, a tradition built on centuries of human wisdom. AI’s ability to recognize and recommend emotionally appropriate actions does not replace such wisdom, but may act as a useful companion, especially for a generation of Thais growing up in a digital-first world.
Looking ahead, there is considerable excitement, but also a need for vigilance. As international research continues, both advances and limitations are likely to come to light. Continued cross-cultural validation will be key, as will careful monitoring of AI’s “emotional IQ” in linguistically and culturally specific contexts such as southern Thailand’s multicultural schools or Isan’s close-knit rural communities. There is also the concern that over-reliance on AI for emotional guidance could blunt some of the uniquely Thai forms of empathy and interpersonal delicacy.
For Thai readers, especially educators, HR professionals, and parents, the practical takeaway is clear: AI has proven surprisingly capable of demonstrating emotional intelligence and even generating new EI assessments, but machines should not replace human intuition, cultural subtlety, or the critical role of trained experts. Thais considering AI tools in classrooms, counseling offices, or corporate training rooms should ensure these systems are always deployed alongside culturally competent professionals, and should encourage public dialogue around the ethical and social implications.
To stay informed, Thais are encouraged to track ongoing research into AI and emotional intelligence by tuning into reputable news sources, consulting with local psychological experts, and joining community discussions about the integration of AI in sensitive social environments. By balancing a respect for tradition with a spirit of innovation, Thailand can leverage this new technology to promote emotional health, positive relationships, and harmonious communities for years to come.