Emotion Recognition via Facial Expression: Utilization of Numerous Feature Descriptors in Different Machine Learning Algorithms

Document Type

Conference Proceeding

Publication Date

2019

Abstract

Emotion recognition has been a prominent area of study since well before computers reached their present computing power. Human emotions can be recognized through body language, behavior and, most evidently, a person's facial expression. In facial image classification, each facial image can be represented through feature descriptors: simplified representations of the facial image that capture its essential key facial features. This study determines which feature descriptor best fits each machine learning algorithm for classifying facial expressions. Twelve possible combinations of Key Facial Detection, Saliency Mapping, Local Binary Pattern (LBP), and Histogram of Oriented Gradients (HOG) are investigated together with six machine learning classification algorithms, generating a total of seventy-two models. These models classify the following emotions: anger, disgust, fear, joy, neutral, sadness, and surprise. A stratified ten-fold cross-validation is performed for verification on both the CK+ dataset and a locally gathered dataset for "in the wild" image testing. Among the seventy-two models, the RBF SVM HOG+LBP model attained the highest average accuracy of 0.94 across the seven emotions, with an F1 score of 0.93.
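
A minimal sketch of such a pipeline is shown below. The library choices (scikit-image, scikit-learn), descriptor parameters, and image size are illustrative assumptions, not the authors' implementation; it only demonstrates how concatenated HOG+LBP features could be paired with an RBF-kernel SVM under stratified ten-fold cross-validation.

# Hypothetical sketch: HOG+LBP features with an RBF SVM and stratified 10-fold CV.
# All parameter values below are assumptions for illustration.
import numpy as np
from skimage.feature import hog, local_binary_pattern
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

def hog_lbp_features(gray_image):
    """Concatenate HOG and LBP-histogram descriptors for one grayscale face image."""
    hog_vec = hog(gray_image, orientations=9,
                  pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    lbp = local_binary_pattern(gray_image, P=8, R=1, method="uniform")
    # Uniform LBP with P=8 yields codes 0..9, hence 10 histogram bins.
    lbp_hist, _ = np.histogram(lbp, bins=np.arange(0, 11), density=True)
    return np.concatenate([hog_vec, lbp_hist])

if __name__ == "__main__":
    # Stand-in random "faces"; in practice these would be cropped grayscale face images.
    rng = np.random.default_rng(0)
    X = np.array([hog_lbp_features(rng.random((48, 48))) for _ in range(70)])
    y = np.repeat(np.arange(7), 10)  # 7 emotion classes, 10 samples each

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)
    print("Mean CV accuracy:", scores.mean())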
