Please use this identifier to cite or link to this item:
https://ptsldigital.ukm.my/jspui/handle/123456789/487014
Title: | Human emotion recognition using multimodal psychophysiological signals |
Authors: | Khairun Nisa Mihat (Minhad) (P73135) |
Supervisor: | Sawal Hamid Md Ali, Assoc. Prof. Dr. |
Keywords: | Psychophysiology; Emotions; Traffic accidents; Universiti Kebangsaan Malaysia -- Dissertations; Dissertations, Academic -- Malaysia |
Issue Date: | 5-Aug-2019 |
Description: | Emotionally distracted driving can lead to aggressive driving, which is reported as one of the most common causes of road accidents. Recognising the patterns of psychophysiological changes is key to intelligent vehicle systems that mitigate the consequences of emotions. A warning sign, or tools with which individuals can control their emotions while behind the wheel, would therefore help overcome aggressive driving behaviour. This work involves the emotion elicitation process and the classification of emotions from a set of stimuli. In this research, a new experimental protocol, an effective set of emotion stimuli presented to 69 subjects, and an efficient dimensionality reduction technique are proposed to achieve high emotion classification accuracy. A survey was conducted to gather data on the efficiency of the proposed emotion stimuli. Results showed that the proposed stimuli obtained 80.34% efficacy, compared with 65.69% for the International Affective Picture System (IAPS) database stimuli of the University of Florida. The physiological changes caused by autonomic nervous system (ANS) activity were measured simultaneously through electrocardiogram (ECG), electrodermal activity (EDA) and electromyography (EMG) signals, so that activity changes in the heart, on the skin surface and in the facial muscles could be measured. The signal data were recorded from 69 subjects recruited from Universiti Kebangsaan Malaysia. Effective noise removal was employed, and a fixed event-size segmentation technique was used to preserve the signal information. Subsequently, time-domain, time-frequency-domain, short-time Fourier transform (STFT) and stationary wavelet transform (SWT) techniques were used to extract meaningful signal features. Using a stringent two-way analysis of variance (ANOVA), features with statistically significant differences across the emotion groups were identified.
A comparative study was performed to determine the classifier most suited to the data distribution of this work, namely support vector machine (SVM), k-nearest neighbour (k-NN) and Naïve Bayes. The SVM consistently outperformed the other methods and was therefore selected for further analysis. A projection of the multi-dimensional samples onto a single scalar using a modified linear discriminant analysis (LDA) technique was proposed to increase the classification accuracy. The average emotion classification results showed that the accuracies of the unimodal signals using ECG, EDA and EMG were 95.74%, 88.95% and 78.19%, respectively, whereas the average accuracy of the multimodal signals was 96.53%. A self-assessment survey was conducted after each session to record the subjects' degree of feeling after watching the stimuli. Results showed that the proposed emotion stimuli obtained 81.89%-93.04% efficacy for the negative emotions and 78.81%-82.90% for the positive emotions. The self-assessment data indicated that audio-visual stimuli were an efficient approach to emotion elicitation, at 93.04%. The findings demonstrate that the proposed stimuli and dimensionality reduction technique, together with the use of appropriate features, contribute to high emotion classification accuracy. (Ph.D.) |
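The pipeline described in the abstract — fixed-size segmentation of a physiological signal followed by time-domain and STFT feature extraction — can be sketched as follows. This is an illustrative example only, not the thesis code: the sampling rate, segment length, window size and feature choices here are assumptions, and the dummy signal stands in for a real ECG/EDA/EMG recording.

```python
# Sketch: fixed-length segmentation of a 1-D physiological signal, then
# simple time-domain statistics plus STFT magnitude features per segment.
import numpy as np
from scipy.signal import stft

def segment(signal, seg_len):
    """Split a 1-D signal into non-overlapping fixed-length segments."""
    n = len(signal) // seg_len
    return signal[:n * seg_len].reshape(n, seg_len)

def features(seg, fs):
    """Time-domain stats plus mean STFT magnitude per frequency bin."""
    f, t, Z = stft(seg, fs=fs, nperseg=64)
    return np.concatenate([
        [seg.mean(), seg.std(), np.sqrt(np.mean(seg ** 2))],  # mean, SD, RMS
        np.abs(Z).mean(axis=1),  # average spectral magnitude per bin
    ])

fs = 256  # assumed sampling rate (Hz)
sig = np.random.default_rng(0).standard_normal(fs * 10)  # 10 s dummy signal
X = np.array([features(s, fs) for s in segment(sig, fs * 2)])  # 2 s segments
print(X.shape)  # → (5, 36): 5 segments, 3 time-domain + 33 spectral features
```

In practice each modality (ECG, EDA, EMG) would contribute its own feature vector before the statistical screening step.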
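The classifier comparison described above can be sketched with scikit-learn. This is a minimal stand-in on synthetic data: standard LDA with one component substitutes for the thesis's modified single-scalar projection, and the dataset shape, classifier settings and cross-validation scheme are assumptions for illustration.

```python
# Sketch: compare SVM, k-NN and Naive Bayes after projecting the samples
# onto a single discriminant axis with LDA, using 5-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic two-class "feature" data standing in for the extracted features.
X, y = make_classification(n_samples=300, n_features=36, n_informative=10,
                           n_classes=2, random_state=0)

scores = {}
for name, clf in [("SVM", SVC()), ("k-NN", KNeighborsClassifier()),
                  ("Naive Bayes", GaussianNB())]:
    # Reduce to one scalar per sample, then classify in that 1-D space.
    pipe = make_pipeline(LinearDiscriminantAnalysis(n_components=1), clf)
    scores[name] = cross_val_score(pipe, X, y, cv=5).mean()

for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

Which classifier wins depends on the data; the thesis reports SVM as the consistent best performer on its recorded signals.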
Pages: | 208 |
Call Number: | QP360.K485 2019 3 tesis |
Publisher: | UKM, Bangi |
Appears in Collections: | Faculty of Engineering and Built Environment / Fakulti Kejuruteraan dan Alam Bina |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
ukmvital_120828+SOURCE1+SOURCE1.2.PDF Restricted Access | | 1.17 MB | Adobe PDF | View/Open |