AI reads emotions from heart rate, breathing, and sweat signals in conversations


Researchers Develop AI to Read Emotions Through Physiological Signals

Researchers from the University of Cincinnati have built an artificial intelligence system that infers a person’s emotional state during conversation by analyzing cues such as heartbeat, breathing patterns, and perspiration. The project was detailed in the journal IEEE Transactions on Affective Computing.

The study aimed to deepen understanding of physiological synchronization, the natural alignment of breathing and heart rate that can occur between people during dialogue or shared activities. This alignment is often interpreted as a sign of mutual understanding or empathy.
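One common way to quantify this kind of synchronization is to correlate two people's physiological time series, allowing for a small time lag since one partner may lead the other. The sketch below illustrates that idea on invented heart-rate samples; the function names, the lag window, and the data are assumptions for illustration, not the study's actual method.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def max_lagged_sync(hr_a, hr_b, max_lag=3):
    """Synchrony score: the best Pearson correlation over small
    time shifts, so either partner may lead the other slightly."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = hr_a[lag:], hr_b[:len(hr_b) - lag]
        else:
            a, b = hr_a[:len(hr_a) + lag], hr_b[-lag:]
        best = max(best, pearson(a, b))
    return best

# Hypothetical heart-rate samples (beats per minute) for two partners
hr_a = [72, 74, 77, 80, 78, 75, 73, 76, 79, 81]
hr_b = [70, 71, 75, 79, 77, 74, 72, 75, 78, 80]
print(f"synchrony: {max_lagged_sync(hr_a, hr_b):.2f}")
```

A score near 1 would indicate strongly aligned signals; values near 0 would indicate no alignment.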

The experimental setup included 36 participants who formed 18 pairs. In a series of interactive sessions, pairs engaged in conversations crafted to evoke a range of emotional responses. Throughout these interactions, researchers monitored biometric indicators to capture real-time physiological data.

Using the collected measurements, the team trained a neural network to classify emotional states from the physiological cues. The model achieved an average accuracy of 75 percent across different conversational contexts.
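The overall pipeline, mapping physiological features to an emotion label and scoring accuracy on held-out data, can be sketched as follows. This is a toy illustration on synthetic data: the feature set, emotion labels, and numbers are invented, and a simple nearest-centroid classifier stands in for the study's neural network.

```python
import random

random.seed(0)  # deterministic synthetic data for the sketch

def make_sample(label):
    """Synthetic (heart rate, breathing rate, skin conductance) per emotion.
    The class centers below are invented for illustration."""
    centers = {"calm": (68, 12, 2.0), "excited": (95, 20, 6.0), "stressed": (88, 24, 8.0)}
    hr, br, sc = centers[label]
    return ([hr + random.gauss(0, 4), br + random.gauss(0, 1.5), sc + random.gauss(0, 0.8)], label)

data = [make_sample(lbl) for lbl in ("calm", "excited", "stressed") for _ in range(30)]
random.shuffle(data)
train, test = data[:60], data[60:]

def fit(samples):
    """Compute the mean feature vector (centroid) of each emotion class."""
    groups = {}
    for x, y in samples:
        groups.setdefault(y, []).append(x)
    return {y: [sum(col) / len(xs) for col in zip(*xs)] for y, xs in groups.items()}

def predict(centroids, x):
    """Assign the label whose centroid is nearest in squared distance."""
    return min(centroids, key=lambda y: sum((a - b) ** 2 for a, b in zip(x, centroids[y])))

centroids = fit(train)
acc = sum(predict(centroids, x) == y for x, y in test) / len(test)
print(f"accuracy: {acc:.2f}")
```

The real study's reported 75 percent figure comes from its own neural network and data; this sketch only shows the shape of the train/classify/evaluate loop.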

These findings could help evaluate the outcomes of many kinds of conversations. They could inform studies of dating dynamics, counseling sessions, or any scenario where reading emotional tone matters.

