Printed from https://ideas.repec.org/a/taf/tprsxx/v61y2023i13p4220-4236.html

A knowledge augmented image deblurring method with deep learning for in-situ quality detection of yarn production

Author

Listed:
  • Chuqiao Xu
  • Junliang Wang
  • Jing Tao
  • Jie Zhang
  • Ray Y. Zhong

Abstract

In the in-situ quality detection of yarn production, image deblurring plays a critical role in vision-based detection systems: it restores a sharp image and provides more accurate input for inspection. However, image deblurring remains challenging because current methods mainly assume a pre-defined blur degree. In dynamic yarn production, the defocus blur degree is strongly associated with the pose of the yarn body, and this relationship can be exploited as prior knowledge to achieve more effective restoration. Thus, a knowledge-augmented deep learning model is proposed to adaptively deblur yarn images with variable defocus blur degrees. A pose classification module designed from prior knowledge is embedded into the deep neural network; it classifies the yarn poses and feeds them into multi-scale deblurring channels. In each channel, the image gradient prior is incorporated into a specially designed loss function to direct the attention of the deblurring network toward the edge details of the yarn. Experimental results from actual spinning processes demonstrate that the proposed method performs better not only in variable-scale deblurring of the global image but also in the restoration of edge details.
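The paper's exact loss formulation is not reproduced in this abstract. As a rough illustration only, one common way to incorporate an image gradient prior into a deblurring loss is to penalise the mismatch between the gradients of the restored and sharp images alongside the pixel-wise error; the sketch below assumes an L1 formulation and illustrative function names, not the authors' actual implementation:

```python
import numpy as np

def image_gradients(img):
    # Forward differences along height and width -- a simple
    # discrete approximation of the image gradient.
    gy = np.diff(img, axis=0)
    gx = np.diff(img, axis=1)
    return gy, gx

def gradient_prior_loss(restored, sharp, lam=0.1):
    # Pixel-wise L1 term plus an L1 penalty on the gradient mismatch.
    # The gradient term emphasises edge details (e.g. yarn contours),
    # which is the role the abstract attributes to the gradient prior.
    # `lam` is a hypothetical weighting hyperparameter.
    pixel = np.mean(np.abs(restored - sharp))
    gy_r, gx_r = image_gradients(restored)
    gy_s, gx_s = image_gradients(sharp)
    edge = np.mean(np.abs(gy_r - gy_s)) + np.mean(np.abs(gx_r - gx_s))
    return pixel + lam * edge
```

With identical inputs the loss is zero, and any edge discrepancy raises it through the gradient term, which is what steers a network trained on such a loss toward sharper edges.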

Suggested Citation

  • Chuqiao Xu & Junliang Wang & Jing Tao & Jie Zhang & Ray Y. Zhong, 2023. "A knowledge augmented image deblurring method with deep learning for in-situ quality detection of yarn production," International Journal of Production Research, Taylor & Francis Journals, vol. 61(13), pages 4220-4236, July.
  • Handle: RePEc:taf:tprsxx:v:61:y:2023:i:13:p:4220-4236
    DOI: 10.1080/00207543.2021.2010827

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1080/00207543.2021.2010827
    Download Restriction: Access to full text is restricted to subscribers.

