Open-source 3D video based behavioral analysis system for neuroscience

Subscribe to receive news and updates for 3DTracker-FAB here.

Looking for more open-source tools for neuroscience? Search Open Ephys and OpenBehavior.

Project

The goal of this open-source project is to facilitate the application of 3D video analysis by developing and sharing an open-source, low-cost, and versatile 3D video analysis system named 3DTracker-FAB. The system is based on one previously developed in the Nishijo lab, which can analyze the behavior of rodents and monkeys in 3D without attaching any markers to the animals. The software will be free, and the total hardware cost will be less than $3,000. We are also preparing rich documentation and an online discussion forum to facilitate improvements and applications.

 

We are calling for contributors (both users and developers) to achieve this goal!

 

Why 3D?

Three-dimensional video-based behavioral analysis enables experiments and analyses that are difficult with 2D video. For example:

  • Robust tracking of overlapped animals (e.g., social interaction)
  • Estimation of detailed 3D postures

 

Why markerless?

Markers (or paint) on an animal’s body often disturb the expression of naturalistic behavior. By making the system markerless, we can analyze emotional postures, social interaction, and any behavioral test that does not allow long habituation or training to the markers.

 

How it works

  1. A 3D video is reconstructed from the images of multiple (~4) depth cameras.
  2. Positions of body parts are estimated by fitting skeletal models to the 3D videos.
  3. Various behaviors are automatically detected based on the trajectories of body parts.
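As an illustration, the first and third steps above might be sketched as follows. This is a minimal Python sketch with hypothetical helper names and parameters; the actual 3DTracker-FAB implementation differs and is more sophisticated (e.g., it fits full skeletal models rather than tracking centroids).

```python
import numpy as np

def merge_point_clouds(clouds, extrinsics):
    """Merge per-camera depth data into one 3D point cloud (step 1, simplified).

    clouds     -- list of (N_i, 3) arrays, one point cloud per depth camera
    extrinsics -- list of 4x4 camera-to-world transformation matrices
    """
    merged = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
        merged.append((homo @ T.T)[:, :3])               # transform into the common world frame
    return np.vstack(merged)

def detect_approach(traj_a, traj_b, threshold=0.05):
    """Detect a behavior from body-part trajectories (step 3, simplified).

    Flags frames where the distance between two tracked body parts
    (e.g., the snouts of two rats) falls below `threshold` (in meters).
    traj_a, traj_b -- (frames, 3) arrays of 3D positions per frame
    """
    dist = np.linalg.norm(traj_a - traj_b, axis=1)
    return dist < threshold
```

The same trajectory-based logic generalizes to other behaviors (e.g., rearing from snout height, following from sustained proximity plus aligned velocity).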

 

Project roadmap

  1. Launching this website to attract attention and suggestions from an early stage.
  2. Closed-source development to build basic functions and clean up the code.
  3. Releasing the software for mice and rats as open-source software. (in 2017)
  4. Publishing the user manual for the system and launching an online discussion forum. (in 2017) <-- Finished up to here
  5. Further improvement of the software (e.g., versions for monkeys and other animal species, scripts for post-processing of the 3D trajectory data, synchronization with other devices, and real-time behavioral detection) based on discussions with users and additional developers. (from 2018)

 

Publications

  1. Nakamura T, Matsumoto J, Nishimaru H, Bretas R, Takamura Y, Hori E, Ono T, Nishijo H. (2016) A markerless 3D computerized motion capture system incorporating a skeleton model for monkeys. PLoS One 11:e0166154 [PubMed]
  2. Matsumoto J, Nishimaru H, Takamura Y, Urakawa S, Ono T, Nishijo H. (2016) Amygdalar auditory neurons contribute to self-other distinction during ultrasonic social vocalization in rats. Front Neurosci 10:399 [PubMed]
  3. Matsumoto J, Uehara T, Urakawa S, Takamura Y, Sumiyoshi T, Suzuki M, Ono T, Nishijo H. (2014) 3D video analysis of the novel object recognition test in rats. Behav Brain Res 272:16-24 [PubMed]
  4. Matsumoto J, Urakawa S, Takamura Y, Malcher-Lopes R, Hori E, Tomaz C, Ono T, Nishijo H. (2013) A 3D-video-based computerized analysis of social and sexual interactions in rats. PLoS One 8:e78460 [PubMed]