Coherent emotional perception from body expressions and the voice

Pei-Wen Yeh, Elena Geangu, Vincent Reid

Research output: Contribution to journal › Article › peer-review

Abstract

Perceiving emotion from multiple modalities enhances the perceptual sensitivity of an individual. This allows more accurate judgments of others’ emotional states, which is crucial to appropriate social interactions. It is known that body expressions effectively convey emotional messages, although fewer studies have examined how this information is combined with auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to the auditory-only condition. While this component was not sensitive to modality congruency, the P2 was sensitive to emotionally incongruent audiovisual pairs. Further, the direction of these congruency effects, in terms of facilitation or suppression, differed depending on the preceding context. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing, whereby the N1 is involved in cross-modal processing, whereas the P2 is related to assessing a unified perceptual content. These data also indicate that emotion integration can be affected by the specific emotion that is presented.
Original language: English
Pages (from-to): 99-108
Number of pages: 10
Journal: Neuropsychologia
Volume: 91
Early online date: 30 Jul 2016
Publication status: Published - Oct 2016

Bibliographical note

© 2016 Elsevier Ltd.

Keywords

  • Audiovisual processing
  • Body expressions
  • Congruency
  • Cross-modal prediction
  • EEG/ERP
  • Emotion
