Researchers in Japan have developed an AI framework designed to model how humans form emotional experiences by integrating bodily signals, sensory input and language. The work was led by scientists at the Nara Institute of Science and Technology in collaboration with Osaka University.
The AI model draws on the theory of constructed emotion, which suggests emotions are built by the brain rather than being hard-wired responses. The system analysed physiological data, visual cues and spoken descriptions together to replicate how people experience feelings in real situations.
Using unlabelled data from volunteers exposed to emotion-evoking images and videos, the system identified emotional patterns without predefined categories. Its outputs matched participants’ own emotional assessments about 75 percent of the time, well above chance.
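The article does not detail the model's architecture, but the general recipe it describes, fusing features from several modalities, discovering emotion categories without labels, and then comparing the discovered groupings to self-reports, can be sketched. The following Python example is purely illustrative: the synthetic data, the early-fusion step and the choice of k-means clustering are assumptions for demonstration, not the authors' implementation.

```python
# Hypothetical sketch of label-free emotion discovery: cluster fused
# multimodal features, then check agreement with self-reported emotions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in data for three modalities (physiological signals, visual
# features, text embeddings) plus self-reported emotion labels that are
# used only for evaluation, never for training.
n_samples, n_emotions = 300, 4
physio = rng.normal(size=(n_samples, 8))    # e.g. heart rate, skin conductance
visual = rng.normal(size=(n_samples, 16))   # e.g. image features
text = rng.normal(size=(n_samples, 32))     # e.g. sentence embeddings
self_reports = rng.integers(0, n_emotions, size=n_samples)

# Early fusion: scale each modality, then concatenate into one vector.
fused = np.hstack([StandardScaler().fit_transform(m)
                   for m in (physio, visual, text)])

# Unsupervised discovery: no predefined emotion categories are supplied.
clusters = KMeans(n_clusters=n_emotions, n_init=10,
                  random_state=0).fit_predict(fused)

# Evaluation: map each cluster to its most common self-reported emotion
# and measure agreement (the article reports ~75% for the real system;
# this random stand-in data will score near chance).
agreement = 0
for c in range(n_emotions):
    members = self_reports[clusters == c]
    if members.size:
        agreement += np.bincount(members).max()
print(f"cluster/self-report agreement: {agreement / n_samples:.0%}")
```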
The researchers say the approach could support emotion-aware AI applications in healthcare, robotics and mental health support, including for understanding emotions that are difficult to express verbally. The findings were published in IEEE Transactions on Affective Computing.
