Person-Independent Emotion and Gender Prediction (EGP) System Using EEG Signals
This paper presents a person-independent Emotion and Gender Prediction (EGP) system based on Electroencephalography (EEG) brain signals. First, the Short-Time Fourier Transform (STFT) is applied to obtain time-frequency information for the selected electrode (Fz). The result is then split into twenty sequential batches according to the electrode recording time in seconds, and the maximum EEG activation voltage is located for every frequency level within each batch to form a 2D time-frequency feature. Next, a sparse autoencoder is applied to convert the distribution of the extracted feature into a more compact and discriminative one. For system evaluation, the public MAHNOB-HCI human-computer interaction database is used with five-fold cross-validation. In the experiments, the proposed feature improves the results of both emotion and gender prediction. For emotion prediction, the highest average accuracies with a Convolutional Neural Network (CNN) classifier are 97.07%, 93.27%, and 91.72% for three, four, and six emotions, respectively. For gender prediction, experiments cover the neutral, amusement, happy, and sad states as well as the mix of all these emotions; the highest average accuracy is obtained with the CNN classifier in all emotion states (>95%), including the state in which all emotions are mixed together. The ability to distinguish between genders when different emotions are mixed together is thus demonstrated in practice.
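The feature-extraction stage described above (STFT of the Fz channel, twenty sequential time batches, maximum activation per frequency level within each batch) can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the 256 Hz sampling rate, window length, and overlap are assumed for demonstration.

```python
import numpy as np
from scipy.signal import stft

FS = 256          # assumed sampling rate (Hz); illustrative only
N_BATCHES = 20    # sequential time batches described in the paper

def extract_feature(fz_signal, fs=FS, n_batches=N_BATCHES):
    """Sketch of the 2D time-frequency feature: STFT of the Fz channel,
    time axis split into n_batches sequential segments, and the maximum
    magnitude kept per frequency bin within each segment."""
    f, t, Z = stft(fz_signal, fs=fs, nperseg=fs)     # |Z|: (freq, time)
    mag = np.abs(Z)
    # Split the STFT time frames into n_batches nearly equal segments.
    segments = np.array_split(np.arange(mag.shape[1]), n_batches)
    # Maximum activation per frequency bin inside each batch -> (freq, n_batches)
    return np.stack([mag[:, idx].max(axis=1) for idx in segments], axis=1)

# Example on synthetic data: 30 s of random "EEG"
sig = np.random.randn(30 * FS)
feat = extract_feature(sig)
print(feat.shape)   # (FS // 2 + 1, N_BATCHES) = (129, 20)
```

The resulting (frequency x batch) matrix is the 2D feature that is subsequently compressed and fed to the classifier.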
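The sparse autoencoder step compacts the extracted feature by forcing most hidden units to stay near-inactive. A common way to do this, sketched below under assumed dimensions (a 64-unit hidden layer and a target sparsity of 0.05, neither taken from the paper), is to add a KL-divergence penalty between the target sparsity and each unit's mean activation to the reconstruction loss.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def kl_sparsity_penalty(hidden_activations, rho=0.05):
    """KL-divergence sparsity penalty of a sparse autoencoder: penalizes
    hidden units whose mean activation rho_hat deviates from the small
    target sparsity rho."""
    rho_hat = np.clip(hidden_activations.mean(axis=0), 1e-8, 1 - 1e-8)
    return np.sum(rho * np.log(rho / rho_hat)
                  + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

rng = np.random.default_rng(0)
X = rng.random((100, 129 * 20))          # 100 flattened 2D features (illustrative)
W = rng.standard_normal((129 * 20, 64)) * 0.01
H = sigmoid(X @ W)                        # encoder hidden-layer activations
loss_sparse = kl_sparsity_penalty(H)      # added to the reconstruction loss
```

During training, this penalty is weighted and summed with the reconstruction error; the trained hidden layer then serves as the compact, more discriminative representation passed to the classifier.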
The International Arab Journal of Information Technology, Vol. 19, No. 4, July 2022

Haitham Issa received his Ph.D. in Communications and Information Systems from Zhejiang University, Hangzhou, China, in 2002. He is currently teaching in the Department of Electrical Engineering at Zarqa University, Zarqa, Jordan. His main research interests include machine learning, signal processing, electronics, and communications.

Qinmu Peng received his Ph.D. in Computer Science from Hong Kong Baptist University, Hong Kong, in 2015. He is currently teaching in the Department of Electrical Information and Communications, Huazhong University of Science and Technology, Wuhan, China. His main research interests include machine learning, medical image processing, and pattern recognition.

Sali Issa received her Ph.D. in Electrical Information and Communication Engineering from Huazhong University of Science and Technology, Wuhan, China, in 2020. She is currently teaching in the Department of Electrical Information of Science and Technology at Hubei University of Education, Wuhan, China. Her main research interests include artificial intelligence, signal processing, and brain-computer interface.

Xinge You received his Ph.D. in Computer Science from Hong Kong Baptist University, Hong Kong, in 2004. He is currently a professor in the School of Electrical Information and Communications, Huazhong University of Science and Technology, Wuhan, China.

Ruijiao Peng graduated from Hubei University of Traditional Chinese Medicine in 1995. She is currently a doctor at Baiguo Hospital, Macheng City, China.

Jing Wang received her Ph.D. in Radiology from Huazhong University of Science and Technology, Wuhan, China, in 2010. She is currently an associate professor in the Radiology Department of Union Hospital, Wuhan, China.