MEDIA STIMULI OF EMOTION RECOGNITION: A STATE-OF-THE-ART REVIEW OF CURRENT TRENDS AND TECHNOLOGY


(Received: 12-Jul.-2023, Revised: 8-Sep.-2023 and 26-Sep.-2023, Accepted: 1-Oct.-2023)
Emotion recognition has attracted considerable interest in recent years, with applications in mental health, education and marketing. This systematic literature review provides an up-to-date overview of trends and technological advances in the use of media stimuli for emotion recognition. A comprehensive search yielded 720 relevant studies published between 2018 and 2023 that employed various media stimuli to induce and measure emotional responses. The main findings indicate that audio and video are the most widely used media stimuli for emotion recognition; however, there is a growing trend toward other modalities, such as physiological signals and wearables. The review highlights the varying ecological validity of different stimulus types and emphasizes the potential of virtual reality for more objective emotion recognition. By synthesizing this knowledge, the review offers valuable insights to guide future research and practical applications in the field.
