An international study by researchers at the Hebrew University of Jerusalem and collaborators in the United States reveals a striking psychological reality: people value emotional support more when they believe it comes from another human, even when it is actually generated by an artificial intelligence (AI) model. Published in the journal Nature Human Behaviour, the study involved over 6,200 participants across nine experiments and tested how people perceive empathy depending on the stated source of a message: human or artificial.
• Perception Matters
All messages in the study were written by a large language model (LLM). Half of the participants were told that the message they received came from a real person; the other half were told it came from an AI chatbot. The results were clear: those who believed the message came from a person rated it as more caring, encouraging, and empathetic, and they reported more positive emotions and fewer negative emotions after the interaction.
When they were told the message came from an AI, their reactions were significantly cooler, even though the messages were identical.
• Is the Human Touch Irreplaceable?
"Even though AI can simulate empathy, people still distinguish between what is human and what is artificial," the authors explained. This "human touch" appears to be a key psychological element in how people evaluate the authenticity of a supportive message. The researchers warn that an overreliance on AI for emotional communication could have negative effects on human relationships, amplifying feelings of loneliness and eroding interpersonal trust.
• Implications for Education, Health, and Technology
The findings of this study raise important questions for fields where AI is increasingly present. In education, AI can support the learning process, but it cannot completely replace the teacher as an affective and relational model. In mental health, chatbots can be a temporary or crisis support, but the human therapeutic relationship remains fundamental. In the design of AI interfaces, it is crucial to understand the limits of simulated empathy and the psychological effects of using AI in sensitive contexts.
Although advanced language models can generate emotional and persuasive responses, the perceived source of a message deeply influences how people react to it. Empathy remains, essentially, a human trait, and its simulation by AI, no matter how sophisticated, cannot substitute for the authenticity of real support provided by another person. In a world where AI becomes a conversation partner, the choice between what is useful and what is human will matter more and more.