A novel automated rodent tracker (ART), demonstrated in a mouse model of amyotrophic lateral sclerosis

Brett M Hewitt, Moi Hoon Yap, Emma F Hodson-Tole, Aneurin J Kennerley, Paul S Sharp, Robyn A Grant

Research output: Contribution to journal › Article › peer-review


BACKGROUND: Generating quantitative metrics of rodent locomotion and general behaviours from video footage is important in behavioural neuroscience studies. However, there is not yet a free software system that can process large amounts of video data with minimal user intervention.

NEW METHOD: Here we propose a new automated rodent tracker (ART) that uses a simple rule-based system to quickly and robustly track rodent nose and body points, with minimal user input. Tracked points can then be used to identify behaviours, approximate body size and provide locomotion metrics, such as speed and distance.
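
The abstract does not give implementation details, but as a rough illustration of how tracked points can be turned into locomotion metrics, a minimal sketch follows. The function name, the frame rate and the pixel-to-millimetre calibration are all assumptions for illustration, not taken from the paper.

```python
import numpy as np

def locomotion_metrics(points_px, fps, px_per_mm):
    """Per-frame speed (mm/s) and total distance (mm) from tracked points.

    points_px : (n_frames, 2) array of x, y pixel coordinates for one
    tracked point (e.g. the body centroid). fps and px_per_mm are
    assumed calibration values.
    """
    pts_mm = np.asarray(points_px, dtype=float) / px_per_mm
    step = np.linalg.norm(np.diff(pts_mm, axis=0), axis=1)  # mm moved per frame
    speed = step * fps                                      # mm/s per frame pair
    return speed, step.sum()

# Hypothetical usage: a short track sampled at 25 fps, 4 px per mm.
track = np.array([[10, 10], [14, 13], [20, 18], [27, 24]])
speed, distance = locomotion_metrics(track, fps=25, px_per_mm=4.0)
print(speed, distance)
```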

RESULTS: ART was demonstrated here on video recordings of the SOD1 mouse model of amyotrophic lateral sclerosis, aged 30, 60, 90 and 120 days. Results showed a robust decline in locomotion speeds, as well as a reduction in object exploration and forward movement, with an increase in the time spent still. Body size approximations (centroid width) showed a significant decrease from P30.

COMPARISON WITH EXISTING METHOD(S): ART performed with accuracy very similar to manual tracking and Ethovision (a commercially available alternative), with average differences in coordinate points of 0.6 mm and 0.8 mm, respectively. However, it required much less user intervention than Ethovision (6 mouse clicks as opposed to 30) and worked robustly over more videos.
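
As an illustration of how such an agreement figure can be computed, the sketch below takes two frame-aligned coordinate tracks (e.g. ART output versus manual annotation) and returns their mean Euclidean difference in millimetres; the function name and calibration parameter are hypothetical, not part of the published method.

```python
import numpy as np

def mean_coord_difference(track_a_px, track_b_px, px_per_mm):
    """Mean per-frame Euclidean difference (mm) between two frame-aligned
    tracks of (x, y) pixel coordinates. px_per_mm is an assumed calibration."""
    a = np.asarray(track_a_px, dtype=float)
    b = np.asarray(track_b_px, dtype=float)
    return np.linalg.norm(a - b, axis=1).mean() / px_per_mm
```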

CONCLUSIONS: ART provides an open-source option for behavioural analysis of rodents, performing to the same standards as commercially available software. It can be considered a validated and accessible alternative for researchers for whom non-invasive quantification of natural rodent behaviour is desirable.

Original language: English
Pages (from-to): 1-10
Number of pages: 10
Journal: Journal of Neuroscience Methods
Early online date: 13 Apr 2017
Publication status: E-pub ahead of print - 13 Apr 2017


