This study explores the feasibility of emotion recognition using behind-the-ear photoplethysmography (PPG) signals in wearable systems. Emotion-inducing tasks were designed for 15 participants, with PPG signals recorded and segmented into 3-second and 5-second windows to examine the impact of time resolution on classification accuracy. Continuous Wavelet Transform (CWT) was employed to convert these signals into high-resolution time-frequency representations, which were then classified using ResNet-50 and a custom Convolutional Neural Network (CNN). ResNet-50 achieved a classification accuracy of 93% on 5-second segments, outperforming the custom CNN and demonstrating the ability of deep residual networks to extract emotion-relevant features from physiological signals. These findings underscore the potential of behind-the-ear PPG for real-time emotion recognition, enabling applications in human-computer interaction, mental health monitoring, and wearable technologies. Future efforts will focus on expanding the dataset, improving model robustness, and integrating the proposed framework into functional wearable devices for continuous emotion monitoring.
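To make the segmentation and time-frequency step concrete, the sketch below shows one possible implementation of slicing a PPG recording into 5-second windows and converting each window into a CWT scalogram with PyWavelets. The sampling rate, wavelet choice (Morlet), and scale range are illustrative assumptions, not parameters reported by the study.

```python
# Minimal sketch of the windowing + CWT step described above.
# FS, the wavelet, and the scale range are assumed values for illustration.
import numpy as np
import pywt

FS = 100          # assumed PPG sampling rate (Hz)
WINDOW_SEC = 5    # 5-second segments, as in the abstract


def segment_signal(ppg, fs=FS, window_sec=WINDOW_SEC):
    """Slice a 1-D PPG recording into non-overlapping fixed-length windows."""
    step = fs * window_sec
    return [ppg[i:i + step] for i in range(0, len(ppg) - step + 1, step)]


def ppg_to_scalogram(segment, fs=FS, wavelet="morl", n_scales=64):
    """Convert one PPG window into a time-frequency (CWT) magnitude scalogram."""
    scales = np.arange(1, n_scales + 1)
    coeffs, _freqs = pywt.cwt(segment, scales, wavelet, sampling_period=1.0 / fs)
    return np.abs(coeffs)  # shape: (n_scales, n_samples)


if __name__ == "__main__":
    demo = np.random.randn(FS * 30)                 # placeholder 30 s signal
    windows = segment_signal(demo)
    scalograms = [ppg_to_scalogram(w) for w in windows]
    print(len(windows), scalograms[0].shape)        # 6 windows, each (64, 500)
```

In a full pipeline the resulting scalograms would be resized and fed as image-like inputs to ResNet-50 or the custom CNN for classification.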