2009 IEEE International Conference on Systems, Man, and Cybernetics
Abstract
Determining the correlation between aroused emotion and its manifestation in facial expression, voice, gesture, and posture has interesting applications in psychotherapy. A set of audio-visual stimuli, selected by a group of experts, is used to excite emotion in the subjects. EEG signals and facial expressions of the subjects excited by the selected audio-visual stimuli are collected, and the nonlinear correlation from EEG to facial expression, and vice versa, is obtained by employing a feed-forward neural network trained with the back-propagation algorithm. Experiments undertaken reveal that the trained network can reproduce the correlated EEG-facial expression training instances with 100% accuracy, and is also able to predict facial expression (EEG) from unknown EEG (facial expression) of the same subject with an accuracy of around 95.2%.
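The mapping described above can be sketched in minimal form: a feed-forward network with one hidden layer, trained by back-propagation, that learns a nonlinear mapping from one feature vector (standing in for EEG features) to another (standing in for facial-expression features). This is an illustrative sketch, not the authors' implementation; the data is synthetic and all dimensions, layer sizes, and the learning rate are assumptions.

```python
# Hedged sketch: feed-forward net + back-propagation mapping "EEG" features
# to "facial expression" features. Synthetic data; dimensions are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic inputs (stand-in for EEG features) and targets (stand-in for
# facial-expression features): 200 samples, 8 inputs, 3 outputs (assumed).
X = rng.normal(size=(200, 8))
true_W = rng.normal(size=(8, 3))
Y = np.tanh(X @ true_W)

# One hidden layer of 16 tanh units (an assumption, not the paper's topology).
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 3)); b2 = np.zeros(3)
lr = 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    return H, H @ W2 + b2         # linear output layer

_, Y0 = forward(X)
initial_loss = np.mean((Y0 - Y) ** 2)

for _ in range(500):
    H, Yhat = forward(X)
    err = Yhat - Y                        # gradient of MSE w.r.t. output
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)      # back-propagate through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2        # gradient-descent updates
    W1 -= lr * gW1; b1 -= lr * gb1

_, Y1 = forward(X)
final_loss = np.mean((Y1 - Y) ** 2)
print(initial_loss, "->", final_loss)
```

Training the reverse direction (expression features to EEG features) would use the same scheme with inputs and targets swapped, which is how a bidirectional correlation of the kind the abstract describes can be obtained with two such networks.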