Printed from https://ideas.repec.org/a/nat/natcom/v15y2024i1d10.1038_s41467-024-50835-7.html

Curriculum learning for ab initio deep learned refractive optics

Author

Listed:
  • Xinge Yang (King Abdullah University of Science and Technology (KAUST))
  • Qiang Fu (King Abdullah University of Science and Technology (KAUST))
  • Wolfgang Heidrich (King Abdullah University of Science and Technology (KAUST))

Abstract

Deep optical optimization has recently emerged as a new paradigm for designing computational imaging systems using only the output image as the objective. However, it has been limited to either simple optical systems consisting of a single element such as a diffractive optical element or metalens, or the fine-tuning of compound lenses from good initial designs. Here we present a DeepLens design method based on curriculum learning, which is able to learn optical designs of compound lenses ab initio from randomly initialized surfaces without human intervention, therefore overcoming the need for a good initial design. We demonstrate the effectiveness of our approach by fully automatically designing both classical imaging lenses and a large field-of-view extended depth-of-field computational lens in a cellphone-style form factor, with highly aspheric surfaces and a short back focal length.
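The core idea the abstract describes — starting optimization from a random surface and gradually increasing the difficulty of the design task — can be illustrated with a minimal, hypothetical sketch. This is not the paper's actual DeepLens pipeline: the real method differentiably ray-traces a compound lens, whereas here a single scalar coefficient stands in for the lens and the "curriculum" linearly grows the aperture during training. All function names and parameters are illustrative.

```python
import random

def curriculum(step, ramp_steps, start, end):
    """Linearly grow a design parameter (e.g. aperture or field of view)
    from an easy starting value to its full target over training."""
    t = min(step / ramp_steps, 1.0)
    return start + t * (end - start)

def spot_loss(coeff, aperture):
    """Toy stand-in for differentiable ray tracing: a quadratic surface
    whose ideal coefficient is 0.5; the loss is the mean squared focus
    error over rays filling the current aperture."""
    heights = [aperture * (k / 10.0 - 1.0) for k in range(21)]
    return sum((coeff * h**2 - 0.5 * h**2) ** 2 for h in heights) / len(heights)

def grad(coeff, aperture, eps=1e-6):
    """Central-difference gradient (a real pipeline would use autodiff)."""
    return (spot_loss(coeff + eps, aperture)
            - spot_loss(coeff - eps, aperture)) / (2 * eps)

coeff = random.Random(0).uniform(-1.0, 1.0)      # random initialization, as in the paper
for step in range(2000):
    # Easy task first (small aperture), full task later (full aperture).
    aperture = curriculum(step, 1500, 0.1, 1.0)
    coeff -= 0.5 * grad(coeff, aperture)
```

The small initial aperture keeps early gradients well-behaved when the random starting point is far from any usable design; by the time the full aperture is reached, the parameter is already near a good solution. The paper applies this schedule to aperture, field of view, and other design targets of a full compound lens.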

Suggested Citation

  • Xinge Yang & Qiang Fu & Wolfgang Heidrich, 2024. "Curriculum learning for ab initio deep learned refractive optics," Nature Communications, Nature, vol. 15(1), pages 1-8, December.
  • Handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-50835-7
    DOI: 10.1038/s41467-024-50835-7

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-024-50835-7
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-024-50835-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS

    1. Yuanlong Zhang & Xiaofei Song & Jiachen Xie & Jing Hu & Jiawei Chen & Xiang Li & Haiyu Zhang & Qiqun Zhou & Lekang Yuan & Chui Kong & Yibing Shen & Jiamin Wu & Lu Fang & Qionghai Dai, 2023. "Large depth-of-field ultra-compact microscope by progressive optimization and deep learning," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    2. Ethan Tseng & Shane Colburn & James Whitehead & Luocheng Huang & Seung-Hwan Baek & Arka Majumdar & Felix Heide, 2021. "Neural nano-optics for high-quality thin lens imaging," Nature Communications, Nature, vol. 12(1), pages 1-7, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Corey A. Richards & Christian R. Ocier & Dajie Xie & Haibo Gao & Taylor Robertson & Lynford L. Goddard & Rasmus E. Christiansen & David G. Cahill & Paul V. Braun, 2023. "Hybrid achromatic microlenses with high numerical apertures and focusing efficiencies across the visible," Nature Communications, Nature, vol. 14(1), pages 1-11, December.
    2. Zhaoyi Li & Raphaël Pestourie & Joon-Suh Park & Yao-Wei Huang & Steven G. Johnson & Federico Capasso, 2022. "Inverse design enables large-scale high-performance meta-optics reshaping virtual reality," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    3. Qingbin Fan & Weizhu Xu & Xuemei Hu & Wenqi Zhu & Tao Yue & Cheng Zhang & Feng Yan & Lu Chen & Henri J. Lezec & Yanqing Lu & Amit Agrawal & Ting Xu, 2022. "Trilobite-inspired neural nanophotonic light-field camera with extreme depth-of-field," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    4. Yuanlong Zhang & Xiaofei Song & Jiachen Xie & Jing Hu & Jiawei Chen & Xiang Li & Haiyu Zhang & Qiqun Zhou & Lekang Yuan & Chui Kong & Yibing Shen & Jiamin Wu & Lu Fang & Qionghai Dai, 2023. "Large depth-of-field ultra-compact microscope by progressive optimization and deep learning," Nature Communications, Nature, vol. 14(1), pages 1-15, December.
    5. Gang Wu & Mohamed Abid & Mohamed Zerara & Jiung Cho & Miri Choi & Cormac Ó Coileáin & Kuan-Ming Hung & Ching-Ray Chang & Igor V. Shvets & Han-Chun Wu, 2024. "Miniaturized spectrometer with intrinsic long-term image memory," Nature Communications, Nature, vol. 15(1), pages 1-11, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:15:y:2024:i:1:d:10.1038_s41467-024-50835-7. See general information about how to correct material in RePEc.


    For technical questions regarding this item, or to correct its authors, title, abstract, or bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.