Automatic emotional stimulus identification from facial expressions

Vered Aharonson, Nadav Nehmadi, Hagit Messer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Humans can usually detect emotional signs in other humans' facial expressions and infer the underlying emotion. To date, however, no satisfactory automatic method has been able to quantitatively map facial expression attributes to real-life emotional stimuli. We have developed a method that statistically relates a facial expression to the stimulus that aroused it. The results show that the probability of guessing the stimulus from features extracted from facial animation points (FAPs) is higher than the prior probability alone. That is, for the first time we can point to a systematic approach for automatically relating facial expressions to emotional stimuli.
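The abstract's central claim is that a classifier fed FAP-derived features predicts the stimulus better than guessing from the prior alone. The sketch below illustrates that comparison on synthetic data with a simple nearest-centroid classifier; the feature dimensions, stimulus labels, and class separations are all invented for illustration and are not taken from the paper.

```python
import random

# Hypothetical illustration: compare stimulus-classification accuracy from
# FAP-like features against the prior (majority-class) baseline.
# All values below are synthetic assumptions, not the authors' data.
random.seed(0)

STIMULI = ["pleasant", "unpleasant", "neutral"]

def synthetic_fap_features(stimulus):
    """Toy 3-dim feature vector whose mean shifts with the stimulus."""
    center = {"pleasant": 1.0, "unpleasant": -1.0, "neutral": 0.0}[stimulus]
    return [center + random.gauss(0.0, 0.5) for _ in range(3)]

# Small labelled dataset, deliberately unbalanced so the prior is informative.
labels = ["pleasant"] * 50 + ["unpleasant"] * 30 + ["neutral"] * 20
data = [(synthetic_fap_features(s), s) for s in labels]
random.shuffle(data)
train, test = data[:70], data[70:]

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(3)]

# One centroid per stimulus class, estimated from the training split.
centroids = {s: centroid([x for x, y in train if y == s]) for s in STIMULI}

def predict(x):
    # Assign the class whose centroid is nearest in squared Euclidean distance.
    return min(STIMULI,
               key=lambda s: sum((a - b) ** 2 for a, b in zip(x, centroids[s])))

# Prior baseline: always guess the most frequent stimulus in the training set.
majority = max(STIMULI, key=lambda s: sum(1 for _, y in train if y == s))
prior_acc = sum(1 for _, y in test if y == majority) / len(test)
model_acc = sum(1 for x, y in test if predict(x) == y) / len(test)

print(f"prior baseline accuracy:        {prior_acc:.2f}")
print(f"FAP-feature classifier accuracy: {model_acc:.2f}")
```

With well-separated synthetic classes the feature-based accuracy exceeds the majority-class prior, which is the qualitative relationship the abstract reports; the paper's actual features and statistics differ.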

Original language: English
Title of host publication: Proceedings of the 4th IASTED International Conference on Signal Processing, Pattern Recognition, and Applications, SPPRA 2007
Pages: 333-337
Number of pages: 5
State: Published - 2007
Externally published: Yes
Event: 4th IASTED International Conference on Signal Processing, Pattern Recognition, and Applications, SPPRA 2007 - Innsbruck, Austria
Duration: 14 Feb 2007 - 16 Feb 2007

Publication series

Name: Proceedings of the 4th IASTED International Conference on Signal Processing, Pattern Recognition, and Applications, SPPRA 2007

Conference

Conference: 4th IASTED International Conference on Signal Processing, Pattern Recognition, and Applications, SPPRA 2007
Country/Territory: Austria
City: Innsbruck
Period: 14/02/07 - 16/02/07

Keywords

  • Automatic detection of emotion
  • Facial expressions
  • Image processing
