The International Arab Journal of Information Technology (IAJIT)



Emotion Recognition based on EEG Signals in Response to Bilingual Music Tracks

Emotions are vital for communication in daily life, and their recognition is an important problem in artificial intelligence. Music helps evoke human emotions, and brain signals can effectively describe them. This study utilized Electroencephalography (EEG) signals to recognize four emotions, namely happy, sad, anger, and relax, in response to bilingual (English and Urdu) music. Five genres of English music (rap, rock, hip-hop, metal, and electronic) and five genres of Urdu music (ghazal, qawwali, famous, melodious, and patriotic) were used as external stimuli. Twenty-seven participants consented to take part in the experiment; each listened to three songs of two minutes each and also recorded self-assessments. A commercially available four-channel Muse headband was used for EEG data recording. Frequency- and time-domain features were fused to construct a hybrid feature vector that was then used by classifiers to recognize the emotional response. Hybrid features gave better results than features from either individual domain, and the most commonly and easily recognized emotion was happy. Three classifiers, namely Multilayer Perceptron (MLP), Random Forest, and Hyper Pipes, were used; the highest accuracy achieved was 83.95%, with the Hyper Pipes classification method.
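The hybrid feature construction described above can be illustrated with a minimal sketch: per-channel time-domain statistics are concatenated with per-channel frequency-band powers, and the fused vector is passed to a Random Forest. This is not the authors' implementation; the sampling rate, band edges, epoch length, and synthetic data are assumptions for illustration only (the channel count of four matches the Muse headband).

```python
# Illustrative sketch (not the paper's code): fuse time-domain statistics
# with frequency-band powers per EEG channel, then classify.
import numpy as np
from scipy.signal import welch
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier

FS = 256                                   # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

def hybrid_features(epoch):
    """epoch: (channels, samples) array -> 1-D hybrid feature vector."""
    feats = []
    for ch in epoch:
        # time-domain descriptors
        feats += [ch.mean(), ch.std(), skew(ch), kurtosis(ch)]
        # frequency-domain descriptors: mean power spectral density per band
        f, psd = welch(ch, fs=FS, nperseg=min(len(ch), 256))
        for lo, hi in BANDS.values():
            feats.append(psd[(f >= lo) & (f < hi)].mean())
    return np.asarray(feats)

# Toy data: 40 two-second, four-channel epochs with four emotion labels.
rng = np.random.default_rng(0)
X = np.stack([hybrid_features(rng.standard_normal((4, 2 * FS)))
              for _ in range(40)])
y = rng.integers(0, 4, size=40)            # 0..3 ~ happy, sad, anger, relax
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(X.shape[1])                          # 4 channels x (4 + 3) features = 28
```

The same fused vector could equally be fed to an MLP or Hyper Pipes classifier, as in the study; Random Forest is shown here only because it has a standard scikit-learn implementation.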


Rida Zainab completed her BSc in Computer Engineering at the University of Engineering and Technology (UET) Taxila, Pakistan, in 2017, receiving a gold medal for best performance. She is currently pursuing an MSc in Computer Engineering at UET Taxila, Pakistan.

Muhammad Majid received a BSc in Computer Engineering with honors from the University of Engineering and Technology (UET) Taxila, Pakistan, in 2005, and an MSc in Data Communications with distinction and a PhD in Electronic and Electrical Engineering from the University of Sheffield, UK, in 2007 and 2011, respectively. He is currently an Associate Professor in the Department of Computer Engineering, UET Taxila. His research interests include video coding and emotion recognition.