A Portable and Affordable Four-Channel EEG System for Emotion Recognition with Self-Supervised Feature Learning.
Emotions play a pivotal role in shaping human decision-making, behavior, and physiological well-being. Electroencephalography (EEG)-based emotion recognition offers promising avenues for real-time self-monitoring and affective computing applications. However, existing commercial solutions are often hindered by high costs, complicated deployment, and limited reliability in practical settings. To address these challenges, we propose a low-cost, self-adaptive wearable EEG system for emotion recognition through a hardware–algorithm co-design approach. The proposed system is a four-channel wireless EEG acquisition device supporting both dry and wet electrodes, with a component cost below USD 35. It offers over 7 h of continuous operation, plug-and-play functionality, and modular expandability. At the algorithmic level, we introduce a self-supervised feature extraction framework that combines contrastive learning and masked prediction tasks, enabling robust emotional feature learning from a limited number of EEG channels with constrained signal quality. Our approach attains a peak performance of 60.2% accuracy and a 59.4% Macro-F1 score on the proposed platform. Compared to conventional feature-based approaches, it achieves an accuracy improvement of up to 20.4% with a multilayer perceptron classifier in our experiment. [ABSTRACT FROM AUTHOR]
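The abstract names the two self-supervised objectives (contrastive learning and masked prediction) but not their exact form. As a minimal sketch, a common instantiation combines an NT-Xent contrastive loss over two augmented views of an EEG segment with a mean-squared reconstruction error on masked time steps; the weighting `alpha` and all function names below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Contrastive NT-Xent loss over two augmented views (sketch).

    z1, z2: (n, d) embeddings of the same n EEG segments under two
    augmentations; row i of z1 and row i of z2 form a positive pair.
    """
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / temperature
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    # positive for sample i is i+n (and vice versa)
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    pos = sim[np.arange(2 * n), targets]
    return float(np.mean(logsumexp - pos))             # cross-entropy form

def masked_prediction_loss(x, recon, mask):
    """MSE computed only on the masked time steps of the raw signal."""
    return float(np.mean((x[mask] - recon[mask]) ** 2))

def combined_loss(z1, z2, x, recon, mask, alpha=0.5):
    """Weighted sum of the two pretext objectives; alpha is an assumption."""
    return alpha * nt_xent_loss(z1, z2) + (1 - alpha) * masked_prediction_loss(x, recon, mask)
```

In this kind of setup, the encoder trained on the combined objective is frozen (or fine-tuned) and its features are fed to a downstream classifier such as the multilayer perceptron mentioned in the abstract.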
Copyright of Mathematics (2227-7390) is the property of MDPI and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)