Researchers from Japan’s Nara Institute of Science and Technology and Osaka University have developed an AI model that simulates how humans form and experience emotions. The system integrates physiological signals, visual cues, and verbal descriptions to recognize emotional states without relying on pre-labeled emotion categories.
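To make the idea concrete, here is a minimal sketch (not the authors’ code) of this kind of multimodal setup: three modality encoders, one each for physiological, visual, and verbal features, whose outputs are concatenated and passed to a small classifier. The feature dimensions and the number of emotion classes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MultimodalEmotionModel(nn.Module):
    """Illustrative fusion of physiological, image, and text features."""

    def __init__(self, physio_dim=8, image_dim=512, text_dim=768, n_emotions=6):
        super().__init__()
        # Per-modality encoders projecting raw features to a shared size.
        self.physio_enc = nn.Sequential(nn.Linear(physio_dim, 64), nn.ReLU())
        self.image_enc = nn.Sequential(nn.Linear(image_dim, 64), nn.ReLU())
        self.text_enc = nn.Sequential(nn.Linear(text_dim, 64), nn.ReLU())
        # Fusion by concatenation, then a classifier over emotion labels.
        self.classifier = nn.Sequential(
            nn.Linear(64 * 3, 128), nn.ReLU(), nn.Linear(128, n_emotions)
        )

    def forward(self, physio, image, text):
        fused = torch.cat(
            [self.physio_enc(physio), self.image_enc(image), self.text_enc(text)],
            dim=-1,
        )
        return self.classifier(fused)  # logits over emotion categories

# Example: a batch of 4 samples with made-up feature vectors.
model = MultimodalEmotionModel()
logits = model(torch.randn(4, 8), torch.randn(4, 512), torch.randn(4, 768))
print(logits.shape)  # torch.Size([4, 6])
```

The actual model in the study is more sophisticated, but the core design choice is the same: each modality is encoded separately and the representations are combined before an emotional state is inferred.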
In the study, 29 participants viewed 60 emotion-evoking images while wearable sensors recorded their heart rate and other physiological responses, and they also described their feelings verbally. The AI model predicted participants’ self-reported emotions correctly about 75% of the time, suggesting it captures patterns that resemble how people perceive and report their own emotional states.
This research could help AI systems respond more naturally and empathetically in everyday interactions. Potential applications include mental health support, assistive technologies for people with developmental disorders or dementia, and smart devices that adapt to a user’s mood. By combining bodily signals, sensory input, and language, such systems could interpret emotions more accurately and offer guidance, support, and more human-like interaction.