
Human listeners' perception of behavioural context and core affect dimensions in chimpanzee vocalizations

Research output: Contribution to journal › Article › peer-review


Publication details

Journal: Proceedings of the Royal Society B: Biological Sciences
Date accepted/In press: 27 May 2020
Date e-pub ahead of print (early online): 17 June 2020
Date published (current): 24 June 2020
Issue number: 1929
Number of pages: 10
Original language: English


Vocalizations linked to emotional states are partly conserved among phylogenetically related species. This continuity may allow humans to accurately infer affective information from vocalizations produced by chimpanzees. In two pre-registered experiments, we examine human listeners' ability to infer behavioural contexts (e.g. discovering food) and core affect dimensions (arousal and valence) from 155 vocalizations produced by 66 chimpanzees in 10 different positive and negative contexts at high, medium or low arousal levels. In experiment 1, listeners (n = 310) categorized the vocalizations in a forced-choice task with 10 response options, and rated arousal and valence. In experiment 2, participants (n = 3120) matched vocalizations to production contexts using yes/no response options. The results show that listeners were accurate at matching vocalizations to most contexts, in addition to inferring arousal and valence. Judgements were more accurate for negative than for positive vocalizations. An acoustic analysis demonstrated that listeners made use of brightness and duration cues, relied on noisiness in making context judgements, and relied on pitch to infer core affect dimensions. Overall, the results suggest that human listeners can infer affective information from chimpanzee vocalizations beyond core affect, indicating phylogenetic continuity in the mapping of vocalizations to behavioural contexts.

Bibliographical note

© 2020 The Authors.
