
DeepFly3D: A deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila

By Semih Günel, Helge Rhodin, Daniel Morales, João Campagnolo, Pavan Ramdya, Pascal Fua

Posted 20 May 2019
bioRxiv DOI: 10.1101/640375 (published DOI: 10.7554/eLife.48571)

Studying how neural circuits orchestrate limbed behaviors requires the precise measurement of the positions of each appendage in 3-dimensional (3D) space. Deep neural networks can estimate 2-dimensional (2D) pose in freely behaving and tethered animals. However, the unique challenges associated with transforming these 2D measurements into reliable and precise 3D poses have not been addressed for small animals, including the fly, Drosophila melanogaster. Here we present DeepFly3D, software that infers the 3D pose of tethered, adult Drosophila—or other animals—using multiple camera images. DeepFly3D does not require manual calibration, uses pictorial structures to automatically detect and correct pose estimation errors, and uses active learning to iteratively improve performance. We demonstrate more accurate unsupervised behavioral embedding using 3D joint angles rather than commonly used 2D pose data. Thus, DeepFly3D enables the automated acquisition of behavioral measurements at an unprecedented level of resolution for a variety of biological applications.
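The core multi-view step described above, lifting 2D detections from several calibrated cameras into a single 3D point, is classically done by direct linear transform (DLT) triangulation. The sketch below is an illustration of that general technique only; the function name and camera setup are hypothetical and are not taken from the DeepFly3D codebase, whose full pipeline (self-calibration, pictorial structures, active learning) is considerably more involved.

```python
import numpy as np

def triangulate_dlt(proj_mats, points_2d):
    """Triangulate one 3D point from its 2D detections in multiple views
    using the direct linear transform (DLT).

    proj_mats: list of 3x4 camera projection matrices.
    points_2d: list of (u, v) pixel coordinates, one per view.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]·X) = P[0]·X, and likewise for v.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Toy usage with two hypothetical cameras: project a known point,
# then recover it from the 2D observations.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.1, -0.2, 2.0])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

pts = [project(P1, X_true), project(P2, X_true)]
X_est = triangulate_dlt([P1, P2], pts)
```

With noiseless observations and known projection matrices, the estimate matches the true point to numerical precision; real systems additionally need the error detection and correction machinery the paper describes.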

Download data

  • Downloaded 1,492 times
  • Download rankings, all-time:
    • Site-wide: 13,195
    • In animal behavior and cognition: 93
  • Year to date:
    • Site-wide: 74,284
  • Since beginning of last month:
    • Site-wide: 72,026

