JIKA Lagen,DU Yuanhua,ZHOU Nan.Emotion Recognition-based Deep Electroencephalography Feedforward Attention Encoder Network Model[J].Journal of Chengdu University of Information Technology,2026,41(02):141-146.[doi:10.16836/j.cnki.jcuit.2026.02.001]
Emotion Recognition-based Deep Electroencephalography Feedforward Attention Encoder Network Model
- Title:
- Emotion Recognition-based Deep Electroencephalography Feedforward Attention Encoder Network Model
- Article ID:
- 2096-1618(2026)02-0141-06
- Keywords:
- emotion recognition; multi-head attention network; deep learning; EEG
- CLC number:
- TP181
- Document code:
- A
- Abstract:
- Neural network models have demonstrated significant potential in deep-learning-based emotion recognition. Electroencephalography (EEG) signals, which capture subtle variations in brain activity and thereby reflect an individual's emotional state, have become an important research focus for emotion recognition. This study proposes a Deep Electroencephalography Feedforward Attention Encoder (DEFAE) network model for EEG-based emotion recognition. The DEFAE model fully exploits the spatiotemporal features of EEG signals, combining a ResNet18 residual network with a multi-head attention network to capture emotion-related key information. The ResNet18 network mitigates the vanishing-gradient and network-degradation problems of deep neural networks, while the multi-head attention mechanism lets the model attend to different parts of the features simultaneously, capturing multiple feature relationships and improving the model's representational capacity and robustness. Experiments on the SEED_IV dataset show that the DEFAE model performs well, achieving an average emotion recognition accuracy of 90.43%. Compared with other classical neural network models, the DEFAE model also demonstrates advantages in performance.
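The abstract's core mechanism, multi-head attention combined with a ResNet-style residual connection, can be sketched in plain NumPy. This is an illustrative sketch of the generic technique only, not the paper's DEFAE implementation: the sequence length, model dimension, head count, and weight initialization below are all assumptions chosen for the toy example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, n_heads):
    """Scaled dot-product attention over n_heads parallel subspaces,
    followed by a ResNet-style residual connection (input + output)."""
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Project to queries/keys/values and split into heads: (heads, seq, d_head)
    q = (x @ w_q).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    # Each head attends over the whole sequence independently
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    attn = softmax(scores, axis=-1)
    # Concatenate heads back to (seq, d_model) and apply the output projection
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return x + out @ w_o   # residual connection, as in ResNet blocks

# Toy example: a length-8 sequence of 16-dimensional feature vectors
# (standing in for EEG feature maps; dimensions here are arbitrary)
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
w = [rng.standard_normal((16, 16)) * 0.1 for _ in range(4)]
y = multi_head_attention(x, *w, n_heads=4)
print(y.shape)  # (8, 16)
```

Because each head computes its own attention weights over the sequence, the heads can specialize in different feature relationships, which is the property the abstract credits with improving robustness; the residual term `x + ...` is what lets such blocks stack deeply without degradation.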
References:
[1] Zhong J,Chen T,Ro T,et al.Emotion recognition with audio,video,EEG,and EMG:A dataset and baseline approaches[J].IEEE Access,2022,10:13229-13242.
[2] Zheng W L,Zhu J Y,Lu B L.Identifying stable patterns over time for emotion recognition from EEG[J].IEEE Transactions on Affective Computing,2019,10(3):417-429.
[3] Maaoui C,Pruski A.Emotion recognition through physiological signals for human-machine communication[M].2010.DOI:10.5772/10312.
[4] Kosti R,Alvarez J M,Recasens A,et al.Emotion recognition in context[C].Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.2017:1667-1675.
[5] Li X,Zhang Y,Tiwari P,et al.EEG based emotion recognition:A tutorial and review[J].ACM Computing Surveys,2022,55(4):Article 79.https://doi.org/10.1145/3524499.
[6] Alhagry S,Fahmy A A,El-Khoribi R A.Emotion recognition based on EEG using LSTM recurrent neural network[J].International Journal of Advanced Computer Science and Applications,2017,8(10).
[7] Nikolova D,Petkova P,Manolova A,et al.ECG-based emotion recognition:Overview of methods and applications[J].ANNA'18; Advances in Neural Networks and Applications,2018:1-5.
[8] Hasnul M A,Aziz N A A,Alelyani S,et al.Electrocardiogram-based emotion recognition systems and their applications in healthcare:A review[J].Sensors,2021,21(15):5015.
[9] Li W,Zhang Z,Song A.Physiological-signal-based emotion recognition:An odyssey from methodology to philosophy[J].Measurement,2021,172:108747.
[10] Soleymani M,Villaro-Dixon F,Pun T,et al.Toolbox for emotional feature extraction from physiological signals(TEAP)[J].Frontiers in ICT,2017,4:1.
[11] McCubbin J A,Merritt M M,Sollers III J J,et al.Cardiovascular-emotional dampening:The relationship between blood pressure and recognition of emotion[J].Psychosomatic Medicine,2011,73(9):743-750.
[12] Ko B C.A brief review of facial emotion recognition based on visual information[J].Sensors,2018,18(2):401.
[13] Yang D,Alsadoon A,Prasad P W C,et al.An emotion recognition model based on facial recognition in virtual learning environments[J].Procedia Computer Science,2018,125:2-10.
[14] Khaireddin Y,Chen Z.Facial emotion recognition:State of the art performance on FER2013[J].arXiv preprint arXiv:2105.03588,2021.
[15] Lim Y,Ng K W,Naveen P,et al.Emotion recognition by facial expression and voice:Review and analysis[J].Journal of Informatics and Web Engineering,2022,1(2):45-54.
[16] Han K,Yu D,Tashev I.Speech emotion recognition using deep neural network and extreme learning machine[C].Interspeech 2014,2014.
[17] Noroozi F,Corneanu C A,Kamińska D,et al.Survey on emotional body gesture recognition[J].IEEE Transactions on Affective Computing,2018,12(2):505-523.
[18] Tzirakis P,Trigeorgis G,Nicolaou M A,et al.End-to-end multimodal emotion recognition using deep neural networks[J].IEEE Journal of Selected Topics in Signal Processing,2017,11(8):1301-1309.
[19] Fan Y,Lu X,Li D,et al.Video-based emotion recognition using CNN-RNN and C3D hybrid networks[C].Proceedings of the 18th ACM International Conference on Multimodal Interaction.2016:445-450.
[20] Akhand M A H,Roy S,Siddique N,et al.Facial emotion recognition using transfer learning in the deep CNN[J].Electronics,2021,10(9):1036.
[21] Pitaloka D A,Wulandari A,Basaruddin T,et al.Enhancing CNN with preprocessing stage in automatic emotion recognition[J].Procedia Computer Science,2017,116:523-529.
[22] Zhang T,Zheng W,Cui Z,et al.Spatial-temporal recurrent neural network for emotion recognition[J].IEEE Transactions on Cybernetics,2018,49(3):839-847.
[23] Ebrahimi Kahou S,Michalski V,Konda K,et al.Recurrent neural networks for emotion recognition in video[C].Proceedings of the 2015 ACM on International Conference on Multimodal Interaction.2015:467-474.
[24] Ma J,Tang H,Zheng W L,et al.Emotion recognition using multimodal residual LSTM network[C].Proceedings of the 27th ACM International Conference on Multimedia.2019:176-183.
[25] Alhagry S,Fahmy A A,El-Khoribi R A.Emotion recognition based on EEG using LSTM recurrent neural network[J].International Journal of Advanced Computer Science and Applications,2017,8(10).
[26] He K,Zhang X,Ren S,et al.Deep residual learning for image recognition[C].Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.2016:770-778.
[27] Vaswani A,Shazeer N,Parmar N,et al.Attention is all you need[J].Advances in Neural Information Processing Systems,2017,30.
[28] Lawhern V J,Solon A J,Waytowich N R,et al.EEGNet:A compact convolutional neural network for EEG-based brain-computer interfaces[J].Journal of Neural Engineering,2018,15(5):056013.
[29] Song T,Zheng W,Song P,et al.EEG emotion recognition using dynamical graph convolutional neural networks[J].IEEE Transactions on Affective Computing,2018,11(3):532-541.
[30] Zhang X,Huang D,Li H,et al.Self-training maximum classifier discrepancy for EEG emotion recognition[J].CAAI Transactions on Intelligence Technology,2023,8(4):1480-1491.
[31] Zhong P,Wang D,Miao C.EEG-based emotion recognition using regularized graph neural networks[J].IEEE Transactions on Affective Computing,2020,13(3):1290-1301.
[32] Li X,Shen F,Peng Y,et al.Efficient sample and feature importance mining in semi-supervised EEG emotion recognition[J].IEEE Transactions on Circuits and Systems II:Express Briefs,2022,69(7):3349-3353.
[33] Zhou Y,Li F,Li Y,et al.Progressive graph convolution network for EEG emotion recognition[J].Neurocomputing,2023,544:126262.
[34] Yu X,Wang S H.Abnormality diagnosis in mammograms by transfer learning based on ResNet18[J].Fundamenta Informaticae,2019,168(2-4):219-230.
[35] Zheng W L,Liu W,Lu Y F,et al.EmotionMeter:A multimodal framework for recognizing human emotions[J].IEEE Transactions on Cybernetics,2019,49(3):1110-1122.
Related Articles:
XIE Zhaomin,CHEN Yongming,GUO Jun.Emotion Recognition from Electroencephalogram(EEG)Signals Using a Hybrid Convolutional and Long Short-term Memory Neural Network Model[J].Journal of Chengdu University of Information Technology,2026,41(02):147.[doi:10.16836/j.cnki.jcuit.2026.02.002]
Memo:
Received: 2024-08-07
Funding: National Natural Science Foundation of China Youth Fund project (1210010189); Sichuan Provincial Department of Science and Technology projects (2023NSFSC0071, 2023NSFSC1362)
Corresponding author: DU Yuanhua. E-mail: duyh@cuit.edu.cn
