JDA-RSDB: a multimodal domain adaptation method for cross-session emotion recognition from EEG and eye movement signals

An Y, Hu S, Liu S, Wang X, Gu Z, Zhang Y (2025) LGDAAN-Nets: a local and global domain adversarial attention neural networks for EEG emotion recognition. Knowl Based Syst. https://doi.org/10.1016/j.knosys.2025.11361

Ben-David S, Blitzer J, Crammer K, Kulesza A, Pereira F, Vaughan JW (2010) A theory of learning from different domains. Mach Learn 79(1):151–175

Chen Y, Xu X, Qin X (2025) Cross-subject and cross-session EEG emotion recognition based on multi-source structural deep clustering. IEEE Trans Cogn Dev Syst 17(5):1245–1259

Cheng C, Cai X, Qi H, Chen W, Zhang Y (2026) MSDA-Net: multi-source domain adaptive network for multi-modal emotion recognition. ACM Trans Asian Low Resour Lang Inf Process 25(1):1–22

Duan R-N, Zhu J-Y, Lu B-L (2013) Differential entropy feature for EEG-based emotion classification. In: 2013 6th International IEEE/EMBS conference on neural engineering (NER), pp 81–84

Fu B, Chu W, Gu C, Liu Y (2024) Cross-modal guiding neural network for multimodal emotion recognition from EEG and eye movement signals. IEEE J Biomed Health Inf 28(10):5865–5876

Gong X, Chen CP, Hu B, Zhang T (2024) CiABL: completeness-induced adaptative broad learning for cross-subject emotion recognition with EEG and eye movement signals. IEEE Trans Affect Comput 15(4):1970–1984

Gong L, Chen W, Li M, Zhang T (2024) Emotion recognition from multiple physiological signals using intra- and inter-modality attention fusion network. Digit Signal Process 144:104278

Gu X, Sun J, Xu Z (2020) Spherical space domain adaptation with robust pseudo-label loss. In: Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, pp 9101–9110

Guo Z, Yang M, Lin L, Li J, Zhang S, He Q, Gao J, Meng H, Chen X, Tao Y et al (2024) E-MFNN: an emotion-multimodal fusion neural network framework for emotion recognition. PeerJ Comput Sci 10:1977

Haeusser P, Frerix T, Mordvintsev A, Cremers D (2017) Associative domain adaptation. In: Proceedings of the IEEE international conference on computer vision, pp 2765–2773

Jiménez-Guarneros M, Fuentes-Pineda G (2024) CFDA-CSF: a multi-modal domain adaptation method for cross-subject emotion recognition. IEEE Trans Affect Comput 15(3):1502–1513

Jiménez-Guarneros M, Fuentes-Pineda G, Grande-Barreto J (2024) MMDA: a multimodal and multisource domain adaptation method for cross-subject emotion recognition from EEG and eye movement signals. IEEE Trans Comput Soc Syst 12(5):2214–2227

Jin M, Li J (2023) Graph to grid: learning deep representations for multimodal emotion recognition. In: Proceedings of the 31st ACM international conference on multimedia, MM ’23. Association for Computing Machinery, New York, pp 5985–5993

Joyce JM (2011) Kullback–Leibler divergence. In: Lovric M (ed) International encyclopedia of statistical science. Springer, Berlin, pp 720–722

Kim T, Kim C (2020) Attract, perturb, and explore: learning a feature alignment network for semi-supervised domain adaptation. In: European conference on computer vision. Springer, pp 591–607

Lan Y-T, Liu W, Lu B-L (2020) Multimodal emotion recognition using deep generalized canonical correlation analysis with an attention mechanism. In: 2020 International joint conference on neural networks (IJCNN), pp 1–6

Lei C, Chen CP, Zhang T (2025) AM-ConvBLS: adaptive manifold convolutional broad learning system for cross-session and cross-subject emotion recognition. IEEE Trans Affect Comput. https://doi.org/10.1109/TAFFC.2025.3565570

Li T-H, Liu W, Zheng W-L, Lu B-L (2019) Classification of five emotions from EEG and eye movement signals: discrimination ability and stability over time. In: 2019 9th International IEEE/EMBS conference on neural engineering (NER). IEEE, pp 607–610

Liu W, Qiu J-L, Zheng W-L, Lu B-L (2021) Comparing recognition performance and robustness of multimodal deep learning models for multimodal emotion recognition. IEEE Trans Cogn Dev Syst 14(2):715–729

Liu W, Zheng W-L, Li Z, Wu S-Y, Gan L, Lu B-L (2022) Identifying similarities and differences in emotion recognition with EEG and eye movements among Chinese, German, and French people. J Neural Eng 19(2):026012

Liu M, Guan D, Zheng C, Zhu Q (2025) Multi-modal discriminative network for emotion recognition across individuals. IEEE Trans Cogn Dev Syst 17(5):1323–1335

Luo G, Han Y, Xie W, Tian F, Zhu L, Qian K, Li X, Sun S, Hu B (2025) GCD-JFSE: graph-based class-domain knowledge joint feature selection and ensemble learning for EEG-based emotion recognition. Knowl Based Syst 309:112770

Montavon G, Orr GB, Müller K (eds) (2012) Neural networks: tricks of the trade, 2nd edn. Lecture notes in computer science, vol 7700. Springer, Berlin

Saito K, Watanabe K, Ushiku Y, Harada T (2018) Maximum classifier discrepancy for unsupervised domain adaptation. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 3723–3732

Saito K, Kim D, Sclaroff S, Darrell T, Saenko K (2019) Semi-supervised domain adaptation via minimax entropy. In: Proceedings of the IEEE/CVF international conference on computer vision, pp 8050–8058

Wang M, Deng W (2018) Deep visual domain adaptation: a survey. Neurocomputing 312:135–153

Wang Z, Zhao M (2025) Dynamic domain adaptive EEG emotion recognition based on multi-source selection. Rev Sci Instrum 96(1):015103

Xiao Y, Zhang Y, Peng X, Han S, Zheng X, Fang D, Chen X (2025a) Multi-source EEG emotion recognition via dynamic contrastive domain adaptation. Biomed Signal Process Control 102:107337

Xiao Z, She Q, Fang F, Meng M, Zhang Y (2025b) Auxiliary classifier adversarial networks with maximum subdomain discrepancy for EEG-based emotion recognition. Med Biol Eng Comput 63(11):3153–3167

Yang Y, Wang Z, Tao W, Liu X, Jia Z, Wang B, Wan F (2024) Spectral-spatial attention alignment for multi-source domain adaptation in EEG-based emotion recognition. IEEE Trans Affect Comput 15(4):2012–2024

Yin Y, Kong W, Tang J, Li J, Babiloni F (2024) PSPN: pseudo-Siamese pyramid network for multimodal emotion analysis. Cogn Neurodyn 18(5):2883–2896

Yu P, He X, Li H, Dou H, Tan Y, Wu H, Chen B (2025) FMLAN: a novel framework for cross-subject and cross-session EEG emotion recognition. Biomed Signal Process Control 100:106912

Zhang J-M, Liu J, Li Z, Ma T-F, Wang Y, Zheng W-L, Lu B-L (2023) Naturalistic emotion recognition using EEG and eye movements. In: International conference on neural information processing. Springer, Berlin, pp 265–276

Zhang Y, Liu H, Wang D, Zhang D, Lou T, Zheng Q, Quek C (2024) Cross-modal credibility modelling for EEG-based multimodal emotion recognition. J Neural Eng 21(2):026040

Zheng W-L, Lu B-L (2015) Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans Auton Ment Dev 7(3):162–175

Zheng W-L, Liu W, Lu Y, Lu B-L, Cichocki A (2018) EmotionMeter: a multimodal framework for recognizing human emotions. IEEE Trans Cybern 49(3):1110–1122

Zhong X-C, Wang Q, Li R, Liu Y, Duan S, Yang R, Liu D, Sun J (2025) Unsupervised domain adaptation with pseudo-label propagation for cross-domain EEG emotion recognition. IEEE Trans Instrum Meas 74:1–11

Zhu S, Qi J, Hu J, Hao S (2022) A new approach for product evaluation based on integration of EEG and eye-tracking. Adv Eng Inform 52:101601

Zhu L, Yu F, Huang A, Ying N, Zhang J (2024a) Instance-representation transfer method based on joint distribution and deep adaptation for EEG emotion recognition. Med Biol Eng Comput 62(2):479–493

Zhu M, Wu Q, Bai Z, Song Y, Gao Q (2024b) EEG-eye movement based subject dependence, cross-subject, and cross-session emotion recognition with multidimensional homogeneous encoding space alignment. Expert Syst Appl 251:124001

Zhu Q, Zhu T, Fei L, Zheng C, Shao W, Zhang D, Zhang D (2025) Multi-modal cross-subject emotion feature alignment and recognition with EEG and eye movements. IEEE Trans Affect Comput 16(3):2102–2115
