Abstract
The present study attempted to induce spontaneous emotions in individuals to determine whether specific patterns of change in facial movement could be identified from the frontal and profile views. A total of 142 volunteers (50 males and 92 females) aged between 18 and 34 years were presented with a series of short videos intended to induce one of three emotions: amusement, sadness, or fear. Their facial behaviour was recorded from three facial views (left, right, and frontal) along with their physiological skin conductance response. After viewing each short film, each participant completed a self-report questionnaire indicating which emotions they felt during particular scenes of the film.
Self-reports revealed that discrete amusement and sadness were experienced by 79.58% and 83.8% of participants, respectively, whereas discrete fear was elicited in only 50% of participants. Skin conductance responses from electrodermal activity (EDA) readings were observed during expressions of amusement and fear, whilst a mixed response pattern was seen for sadness. Skin conductance responses to sadness were marked by a reduction in tonic EDA, appearing as either the presence or absence of skin conductance. As such, it was relatively difficult (particularly for sadness) to reflect emotional changes precisely using a single physiological signal.
At the behavioural level of expression, participants displayed a variety of action units (AUs) that produced similar appearance changes uniquely associated with an emotion. Lucas-Kanade optical flow and facial landmark feature extraction techniques were implemented to quantify the facial changes occurring between the onset (baseline) and peak of emotional expression. Classification using the resulting feature vectors indicated that the landmark extraction method outperformed optical flow analysis, achieving recognition accuracies of 71-75%, 70-72%, and 70-71%, compared to 49-64%, 60-68%, and 64-68% for optical flow, across the frontal, profile left, and profile right facial views, respectively. Whereas amusement was classified consistently across all views, sadness and fear were classified best from the left and right profile views, respectively.
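As an illustrative sketch only, the onset-to-peak motion quantification described above could be approximated with OpenCV's pyramidal Lucas-Kanade tracker: trackable points are selected on the onset (baseline) frame, tracked into the peak-expression frame, and their displacement magnitudes used as a feature vector. The file names, tracker parameters, and feature layout below are assumptions for demonstration, not the study's actual pipeline.

```python
# Minimal sketch: onset-to-peak facial motion via Lucas-Kanade optical flow.
# File names and parameters are hypothetical, chosen for illustration only.
import cv2
import numpy as np

def onset_to_peak_flow_features(onset_path: str, peak_path: str) -> np.ndarray:
    """Return per-point displacement magnitudes between onset and peak frames."""
    onset = cv2.imread(onset_path, cv2.IMREAD_GRAYSCALE)
    peak = cv2.imread(peak_path, cv2.IMREAD_GRAYSCALE)

    # Select trackable corner points on the onset (baseline) frame.
    pts = cv2.goodFeaturesToTrack(onset, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)

    # Track those points into the peak-expression frame.
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(onset, peak, pts, None,
                                                  winSize=(21, 21), maxLevel=3)

    # Keep successfully tracked points; displacement magnitudes form the
    # motion feature vector that a classifier could then consume.
    good_old = pts[status.flatten() == 1].reshape(-1, 2)
    good_new = new_pts[status.flatten() == 1].reshape(-1, 2)
    return np.linalg.norm(good_new - good_old, axis=1)

# Example usage (hypothetical frame files extracted from a recording):
# features = onset_to_peak_flow_features("onset_frame.png", "peak_frame.png")
```

A landmark-based variant would follow the same pattern, replacing the tracked corner points with detected facial landmark coordinates and using their onset-to-peak displacements as the feature vector.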
Thus, spontaneous emotional facial expressions are uniquely expressed and show discrimination potential from the frontal, profile left, and profile right facial views. This research has relevance for real-world applications such as security surveillance, where it could enhance the detection of signs of emotion associated with behaviours intended to cause harm, particularly where the orientation of the face relative to the surveillance camera is not necessarily in-plane with a full frontal view.





