
Full-Body Pose Estimation of Humanoid Robots Using Head-Worn Cameras for Digital Human-Augmented Robotic Telepresence

Author

Listed:
  • Youngdae Cho

    (Department of Metaverse Convergence, Graduate School, Konkuk University, Seoul 05029, Republic of Korea)

  • Wooram Son

    (Department of Metaverse Convergence, Graduate School, Konkuk University, Seoul 05029, Republic of Korea)

  • Jaewan Bak

    (Center for Intelligent and Interactive Robotics, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea)

  • Yisoo Lee

    (Center for Intelligent and Interactive Robotics, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea)

  • Hwasup Lim

    (Center for Artificial Intelligence, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea)

  • Youngwoon Cha

    (Department of Metaverse Convergence, Graduate School, Konkuk University, Seoul 05029, Republic of Korea)

Abstract

We envision a telepresence system that enhances remote work by facilitating both physical and immersive visual interactions between individuals. However, during robot teleoperation, communication often lacks realism, as users see the robot’s body rather than the remote individual. To address this, we propose a method for overlaying a digital human model onto a humanoid robot using XR visualization, enabling an immersive 3D telepresence experience. Our approach employs a learning-based method to estimate the 2D poses of the humanoid robot from head-worn stereo views, leveraging a newly collected dataset of full-body poses for humanoid robots. The stereo 2D poses and sparse inertial measurements from the remote operator are optimized over time to compute 3D poses. The digital human is localized from the perspective of a continuously moving observer using the estimated 3D pose of the humanoid robot. Our moving-camera pose estimation method does not rely on markers or external knowledge of the robot’s status, effectively overcoming challenges such as marker occlusion, calibration issues, and reliance on error-prone headset tracking. We demonstrate the system in a remote physical training scenario, achieving real-time performance at 40 fps, which enables simultaneous immersive and physical interactions. Experimental results show that our learning-based 3D pose estimation method, which operates without prior knowledge of the robot, significantly outperforms alternative approaches that require the robot’s global pose, particularly during rapid headset movements, achieving markerless digital human augmentation from head-worn views.
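
As a rough illustration of the stereo-to-3D step summarized in the abstract, the sketch below triangulates per-joint 2D detections from two calibrated head-worn views into 3D joint positions using standard linear (DLT) triangulation. This is a minimal sketch under assumed inputs (projection matrices and detected 2D skeletons), not the authors' implementation; the paper additionally fuses sparse inertial measurements and optimizes the poses over time, which is omitted here.

```python
# Minimal sketch (not the authors' code): lifting per-joint 2D pose detections
# from a calibrated head-worn stereo pair to 3D via linear (DLT) triangulation.
# Projection matrices and joint coordinates are illustrative placeholders.
import numpy as np

def triangulate_joint(p_left, p_right, P_left, P_right):
    """Recover one 3D joint from corresponding 2D detections in two views.

    p_left, p_right : (2,) pixel coordinates of the same joint in each view
    P_left, P_right : (3, 4) camera projection matrices (intrinsics @ extrinsics)
    """
    # Build the homogeneous linear system A X = 0 implied by x ~ P X.
    A = np.vstack([
        p_left[0]  * P_left[2]  - P_left[0],
        p_left[1]  * P_left[2]  - P_left[1],
        p_right[0] * P_right[2] - P_right[0],
        p_right[1] * P_right[2] - P_right[1],
    ])
    # The 3D point is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

def triangulate_pose(joints_left, joints_right, P_left, P_right):
    """Triangulate every joint of a detected 2D skeleton (N x 2 arrays per view)."""
    return np.array([
        triangulate_joint(pl, pr, P_left, P_right)
        for pl, pr in zip(joints_left, joints_right)
    ])
```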

Suggested Citation

  • Youngdae Cho & Wooram Son & Jaewan Bak & Yisoo Lee & Hwasup Lim & Youngwoon Cha, 2024. "Full-Body Pose Estimation of Humanoid Robots Using Head-Worn Cameras for Digital Human-Augmented Robotic Telepresence," Mathematics, MDPI, vol. 12(19), pages 1-27, September.
  • Handle: RePEc:gam:jmathe:v:12:y:2024:i:19:p:3039-:d:1488184

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/12/19/3039/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/12/19/3039/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:12:y:2024:i:19:p:3039-:d:1488184. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager. General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.
