Gender Perception From Gait: A Comparison Between Biological, Biomimetic and Non-biomimetic Learning Paradigms

Research output: Contribution to journal › Article › peer-review

Publication details

Journal: Frontiers in Human Neuroscience
Date accepted/In press: 20 Jul 2020
Date published: 27 Aug 2020
Volume: 14
Number of pages: 11
Original language: English

Abstract

This paper explores in parallel the underlying mechanisms in human perception of biological motion and the best approaches for automatic classification of gait. The experiments tested three learning paradigms, namely biological, biomimetic, and non-biomimetic models, for gender identification from human gait. Psychophysical experiments with twenty-one observers were conducted alongside computational experiments, without applying any gender-specific modifications to the models or the stimuli. Results demonstrate the use of a generic memory-based learning system in humans for gait perception, thus reducing ambiguity between the two opposing learning systems proposed for biological motion perception. Results also support the biomimetic nature of memory-based artificial neural networks (ANNs) in their ability to emulate biological neural networks, as opposed to non-biomimetic models. In addition, the comparison between biological and computational learning approaches establishes a memory-based biomimetic model as the best candidate for a generic artificial gait classifier (83% accuracy, p < 0.001), compared to human observers (66%, p < 0.005) or non-biomimetic models (83%, p < 0.001), while adhering to human-like sensitivity to gender identification. This promises potential for applying the model to any non-gender-based gait perception objective with superhuman performance.
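The abstract contrasts a memory-based (biomimetic) learner with non-biomimetic models for gait-based gender classification. The sketch below is a minimal illustration of that kind of comparison, not the paper's actual pipeline: it uses synthetic feature vectors as a stand-in for gait descriptors, a k-nearest-neighbours classifier as a generic memory-based (instance-based) learner, and an SVM as an assumed non-biomimetic baseline. All names, feature dimensions, and model choices here are illustrative assumptions.

```python
# Hypothetical sketch: memory-based vs. non-biomimetic classification of
# synthetic "gait feature" vectors. Not the authors' models or data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier  # memory-based (instance-based) learner
from sklearn.svm import SVC                          # assumed non-biomimetic baseline
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-walker gait descriptors (e.g., joint-angle
# statistics over a gait cycle); two overlapping classes for gender labels.
n_per_class, n_features = 200, 24
class_a = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features))
class_b = rng.normal(loc=0.6, scale=1.0, size=(n_per_class, n_features))
X = np.vstack([class_a, class_b])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

models = {
    "memory-based (k-NN)": KNeighborsClassifier(n_neighbors=5),
    "non-biomimetic (SVM)": SVC(kernel="rbf"),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.2%} accuracy on synthetic features")
```

On synthetic data like this the two approaches perform similarly; the paper's point is about how a memory-based biomimetic model compares with human observers and non-biomimetic alternatives on real gait stimuli, which this toy comparison does not reproduce.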

Bibliographical note

© 2020 Sarangi, Pelah, Hahn and Barenholtz.
