
SuperAnimal pretrained pose estimation models for behavioral analysis

Authors

  • Shaokai Ye (Brain Mind Institute & Neuro-X Institute)
  • Anastasiia Filippova (Brain Mind Institute & Neuro-X Institute)
  • Jessy Lauer (Brain Mind Institute & Neuro-X Institute)
  • Steffen Schneider (Brain Mind Institute & Neuro-X Institute)
  • Maxime Vidal (Brain Mind Institute & Neuro-X Institute)
  • Tian Qiu (Brain Mind Institute & Neuro-X Institute)
  • Alexander Mathis (Brain Mind Institute & Neuro-X Institute)
  • Mackenzie Weygandt Mathis (Brain Mind Institute & Neuro-X Institute)

Abstract

Quantification of behavior is critical in diverse applications, from neuroscience and veterinary medicine to animal conservation. A key first step in behavioral analysis is extracting relevant keypoints on animals, known as pose estimation. However, reliable pose inference currently requires domain knowledge and manual labeling effort to build supervised models. We present SuperAnimal, a method for developing unified foundation models that can be used on over 45 species without additional manual labels. These models show excellent performance across six pose estimation benchmarks. We demonstrate how to fine-tune the models (if needed) on differently labeled data and provide tooling for unsupervised video adaptation to boost performance and decrease jitter across frames. When fine-tuned, SuperAnimal models are 10–100× more data efficient than prior transfer-learning-based approaches. We illustrate the utility of our models in behavioral classification and kinematic analysis. Collectively, we present a data-efficient solution for animal pose estimation.
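The abstract's goal of decreasing "jitter across frames" in video pose estimates can be illustrated with a toy temporal filter. The sketch below is hypothetical and is not the paper's method: SuperAnimal's unsupervised video adaptation fine-tunes the model on its own confident video predictions, whereas this example merely applies a moving-average filter to a synthetic keypoint trajectory to show what reduced frame-to-frame jitter looks like numerically. The trajectory, noise scale, and window size are all invented for the demonstration.

```python
import numpy as np

def smooth_keypoints(traj: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average smoothing of a (frames, keypoints, 2) trajectory.

    Toy post-hoc filter illustrating jitter reduction; SuperAnimal's
    video adaptation instead adapts the model itself via pseudo-labels.
    """
    kernel = np.ones(window) / window
    out = np.empty(traj.shape, dtype=float)
    for k in range(traj.shape[1]):          # each keypoint
        for d in range(traj.shape[2]):      # x and y coordinates
            out[:, k, d] = np.convolve(traj[:, k, d], kernel, mode="same")
    return out

# Hypothetical data: one keypoint moving linearly, plus per-frame jitter.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
clean = np.stack([t, t], axis=-1)[:, None, :]            # (100, 1, 2)
noisy = clean + rng.normal(scale=0.05, size=clean.shape)
smoothed = smooth_keypoints(noisy)

def jitter(x: np.ndarray) -> float:
    """Mean squared frame-to-frame displacement, a simple jitter proxy."""
    return float(np.mean(np.diff(x, axis=0) ** 2))

print(jitter(noisy) > jitter(smoothed))  # smoothing lowers the jitter proxy
```

In practice one would apply such a filter (or the paper's model-side adaptation) to predicted keypoint time series before downstream kinematic analysis, since frame-to-frame jitter directly corrupts velocity and acceleration estimates.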

Suggested Citation

  • Shaokai Ye & Anastasiia Filippova & Jessy Lauer & Steffen Schneider & Maxime Vidal & Tian Qiu & Alexander Mathis & Mackenzie Weygandt Mathis, 2024. "SuperAnimal pretrained pose estimation models for behavioral analysis," Nature Communications, Nature, vol. 15(1), pages 1-19, December.
  • Handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-48792-2
    DOI: 10.1038/s41467-024-48792-2

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-024-48792-2
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-024-48792-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS

    1. Steffen Schneider & Jin Hwa Lee & Mackenzie Weygandt Mathis, 2023. "Learnable latent embeddings for joint behavioural and neural analysis," Nature, Nature, vol. 617(7960), pages 360-368, May.
    2. Praneet C. Bala & Benjamin R. Eisenreich & Seng Bum Michael Yoo & Benjamin Y. Hayden & Hyun Soo Park & Jan Zimmermann, 2020. "Automated markerless pose estimation in freely moving macaques with OpenMonkeyStudio," Nature Communications, Nature, vol. 11(1), pages 1-12, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Liang An & Jilong Ren & Tao Yu & Tang Hai & Yichang Jia & Yebin Liu, 2023. "Three-dimensional surface motion capture of multiple freely moving pigs using MAMMAL," Nature Communications, Nature, vol. 14(1), pages 1-14, December.
    2. Ana M. G. Manea & David J.-N. Maisson & Benjamin Voloh & Anna Zilverstand & Benjamin Hayden & Jan Zimmermann, 2024. "Neural timescales reflect behavioral demands in freely moving rhesus macaques," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    3. Daniel J. Butler & Alexander P. Keim & Shantanu Ray & Eiman Azim, 2023. "Large-scale capture of hidden fluorescent labels for training generalizable markerless motion capture models," Nature Communications, Nature, vol. 14(1), pages 1-16, December.
    4. Guihua Xiao & Yeyi Cai & Yuanlong Zhang & Jingyu Xie & Lifan Wu & Hao Xie & Jiamin Wu & Qionghai Dai, 2024. "Mesoscale neuronal granular trial variability in vivo illustrated by nonlinear recurrent network in silico," Nature Communications, Nature, vol. 15(1), pages 1-16, December.
    5. Erik Hermansen & David A. Klindt & Benjamin A. Dunn, 2024. "Uncovering 2-D toroidal representations in grid cell ensemble activity during 1-D behavior," Nature Communications, Nature, vol. 15(1), pages 1-11, December.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.