
Towards real-time photorealistic 3D holography with deep neural networks

Authors

Listed:

  • Liang Shi (Massachusetts Institute of Technology)

  • Beichen Li (Massachusetts Institute of Technology)

  • Changil Kim (Massachusetts Institute of Technology)

  • Petr Kellnhofer (Massachusetts Institute of Technology)

  • Wojciech Matusik (Massachusetts Institute of Technology)

Abstract

The ability to present three-dimensional (3D) scenes with continuous depth sensation has a profound impact on virtual and augmented reality, human–computer interaction, education and training. Computer-generated holography (CGH) enables high-spatio-angular-resolution 3D projection via numerical simulation of diffraction and interference [1]. Yet, existing physically based methods fail to produce holograms with both per-pixel focal control and accurate occlusion [2,3]. The computationally taxing Fresnel diffraction simulation further places an explicit trade-off between image quality and runtime, making dynamic holography impractical [4]. Here we demonstrate a deep-learning-based CGH pipeline capable of synthesizing a photorealistic colour 3D hologram from a single RGB-depth image in real time. Our convolutional neural network (CNN) is extremely memory efficient (below 620 kilobytes) and runs at 60 hertz for a resolution of 1,920 × 1,080 pixels on a single consumer-grade graphics processing unit. Leveraging low-power on-device artificial intelligence acceleration chips, our CNN also runs interactively on mobile (iPhone 11 Pro at 1.1 hertz) and edge (Google Edge TPU at 2.0 hertz) devices, promising real-time performance in future-generation virtual and augmented-reality mobile headsets. We enable this pipeline by introducing a large-scale CGH dataset (MIT-CGH-4K) with 4,000 pairs of RGB-depth images and corresponding 3D holograms. Our CNN is trained with differentiable wave-based loss functions [5] and physically approximates Fresnel diffraction. With an anti-aliasing phase-only encoding method, we experimentally demonstrate speckle-free, natural-looking, high-resolution 3D holograms. Our learning-based approach and the Fresnel hologram dataset will help to unlock the full potential of holography and enable applications in metasurface design [6,7], optical and acoustic tweezer-based microscopic manipulation [8–10], holographic microscopy [11] and single-exposure volumetric 3D printing [12,13].
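
To make the physics concrete: the Fresnel diffraction simulation that the network learns to approximate, and that the wave-based loss functions differentiate through, can be sketched with a band-limited angular spectrum propagator. The listing below is a minimal NumPy illustration, not the authors' published code; the function name and the wavelength and pixel-pitch values are hypothetical choices for the example.

    # Minimal sketch, not the authors' implementation: band-limited angular
    # spectrum propagation of a complex wavefield, a standard way to simulate
    # scalar free-space diffraction. Parameter values are illustrative.
    import numpy as np

    def angular_spectrum_propagate(field, wavelength, pitch, distance):
        # field: 2D complex array (wavefront at the hologram plane)
        # wavelength, pitch, distance: metres
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=pitch)   # spatial frequencies along x
        fy = np.fft.fftfreq(ny, d=pitch)   # spatial frequencies along y
        FX, FY = np.meshgrid(fx, fy)
        # Free-space transfer function; evanescent components are zeroed.
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kz = 2j * np.pi * distance / wavelength * np.sqrt(np.maximum(arg, 0.0))
        H = np.where(arg > 0.0, np.exp(kz), 0.0)
        return np.fft.ifft2(np.fft.fft2(field) * H)

    # Usage: reconstruct a phase-only hologram at a 10 cm focal plane
    # (green light at 532 nm, 8-micrometre pixel pitch, both assumed).
    phase = np.random.uniform(0.0, 2.0 * np.pi, (1080, 1920))
    recon = angular_spectrum_propagate(np.exp(1j * phase), 532e-9, 8e-6, 0.10)
    intensity = np.abs(recon) ** 2         # what a camera would record

In a training loop in the spirit of this pipeline, the predicted hologram would be propagated to several depth planes this way and compared against a target focal stack; replacing NumPy with an automatic-differentiation framework makes such a loss differentiable end to end.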

Suggested Citation

  • Liang Shi & Beichen Li & Changil Kim & Petr Kellnhofer & Wojciech Matusik, 2021. "Towards real-time photorealistic 3D holography with deep neural networks," Nature, Nature, vol. 591(7849), pages 234-239, March.
  • Handle: RePEc:nat:nature:v:591:y:2021:i:7849:d:10.1038_s41586-020-03152-0
    DOI: 10.1038/s41586-020-03152-0

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41586-020-03152-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/s41586-020-03152-0?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy that you can access through your library subscription.

As access to this document is restricted, you may want to search for a different version of it.

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. M. Makowski & J. Bomba & A. Frej & M. Kolodziejczyk & M. Sypek & T. Shimobaba & T. Ito & A. Kirilyuk & A. Stupakiewicz, 2022. "Dynamic complex opto-magnetic holography," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    2. Gong, Bin & An, Aimin & Shi, Yaoke & Zhang, Xuemin, 2024. "Fast fault detection method for photovoltaic arrays with adaptive deep multiscale feature enhancement," Applied Energy, Elsevier, vol. 353(PA).
    3. Changwon Jang & Kiseung Bang & Minseok Chae & Byoungho Lee & Douglas Lanman, 2024. "Waveguide holography for 3D augmented reality glasses," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    4. Hyeonseung Yu & Youngrok Kim & Daeho Yang & Wontaek Seo & Yunhee Kim & Jong-Young Hong & Hoon Song & Geeyoung Sung & Younghun Sung & Sung-Wook Min & Hong-Seok Lee, 2023. "Deep learning-based incoherent holographic camera enabling acquisition of real-world holograms for holographic streaming system," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    5. Zijian Shi & Zhensong Wan & Ziyu Zhan & Kaige Liu & Qiang Liu & Xing Fu, 2023. "Super-resolution orbital angular momentum holography," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    6. Daeho Yang & Wontaek Seo & Hyeonseung Yu & Sun Il Kim & Bongsu Shin & Chang-Kun Lee & Seokil Moon & Jungkwuen An & Jong-Young Hong & Geeyoung Sung & Hong-Seok Lee, 2022. "Diffraction-engineered holography: Beyond the depth representation limit of holographic displays," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    7. Ethan Tseng & Grace Kuo & Seung-Hwan Baek & Nathan Matsuda & Andrew Maimone & Florian Schiffers & Praneeth Chakravarthula & Qiang Fu & Wolfgang Heidrich & Douglas Lanman & Felix Heide, 2024. "Neural étendue expander for ultra-wide-angle high-fidelity holographic display," Nature Communications, Nature, vol. 15(1), pages 1-8, December.
    8. Pengcheng Chen & Xiaoyi Xu & Tianxin Wang & Chao Zhou & Dunzhao Wei & Jianan Ma & Junjie Guo & Xuejing Cui & Xiaoyan Cheng & Chenzhu Xie & Shuang Zhang & Shining Zhu & Min Xiao & Yong Zhang, 2023. "Laser nanoprinting of 3D nonlinear holograms beyond 25000 pixels-per-inch for inter-wavelength-band information processing," Nature Communications, Nature, vol. 14(1), pages 1-9, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:nature:v:591:y:2021:i:7849:d:10.1038_s41586-020-03152-0. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.