A Comprehensive Study on Emotional State Analysis of Humans from EEG Signals

Authors

  • Charu Gitey, M.Tech Scholar, Department of Computer Science & Engineering, Sagar Institute of Research Technology & Science-Excellence, Bhopal, M.P., India
  • Dr. Kamlesh Namdev, Associate Professor, Department of Computer Science & Engineering, Sagar Institute of Research Technology & Science-Excellence, Bhopal, M.P., India

DOI:

https://doi.org/10.24113/ijoscience.v4i12.176

Abstract

Emotion plays an important role in daily human life and is an essential feature of human interaction. Because of its adaptive role, it motivates people to respond quickly to stimuli in their environment, improving communication, learning and decision making. With the increasing role of the brain-computer interface (BCI) in user-computer interaction, automatic emotion recognition has become an area of interest over the last decade. Emotions can be recognized from facial expressions, gestures, speech and text, and the underlying brain activity can be recorded in different ways, such as electroencephalography (EEG), positron emission tomography (PET), magnetic resonance imaging (MRI), etc. In this research work, feature extraction, feature reduction and classification of emotions are evaluated across different methods to recognize and classify emotional states such as fear, sadness, frustration, happiness, pleasantness and satisfaction from EEG signals reflecting inner emotional states.
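
To make the three-stage pipeline named in the abstract concrete, the following is a minimal sketch, not the paper's actual method: it pairs one common choice from each stage (band-power feature extraction, PCA for feature reduction, and a k-nearest-neighbor classifier) on synthetic data. Every parameter here, including the sampling rate, channel count, frequency bands and k, is an assumption made for illustration.

    # Illustrative sketch only: the study evaluates several feature-extraction,
    # feature-reduction and classification methods; this example shows one
    # plausible combination. All data is synthetic and all parameters are
    # assumptions, not values taken from the paper.
    import numpy as np
    from scipy.signal import welch
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    FS = 128  # sampling rate in Hz (hypothetical)
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

    def band_power_features(trial):
        """trial: (channels, samples) EEG segment -> concatenated band powers."""
        freqs, psd = welch(trial, fs=FS, nperseg=FS * 2, axis=-1)
        feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, mask].mean(axis=-1))  # mean power per channel
        return np.concatenate(feats)

    # Synthetic stand-in data: 120 trials, 14 channels, 4 s at FS Hz, with
    # labels drawn from the six emotional states named in the abstract.
    rng = np.random.default_rng(0)
    X_raw = rng.standard_normal((120, 14, 4 * FS))
    states = ["fear", "sad", "frustrated", "happy", "pleasant", "satisfied"]
    y = rng.choice(states, size=120)

    X = np.array([band_power_features(t) for t in X_raw])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    # Standardize, reduce dimensionality with PCA, then classify with k-NN.
    clf = make_pipeline(StandardScaler(),
                        PCA(n_components=10),
                        KNeighborsClassifier(n_neighbors=5))
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))

In an actual experiment, the synthetic trials would be replaced by labeled EEG recordings, and each stage would be swapped against the alternative methods under evaluation.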


Published

12/13/2018

How to Cite

Gitey, C., & Namdev, D. K. (2018). A Comprehensive Study on Emotional State Analysis of Humans from EEG Signals. SMART MOVES JOURNAL IJOSCIENCE, 4(12), 16–20. https://doi.org/10.24113/ijoscience.v4i12.176