
Dual-Channel and Two-Stage Dehazing Network for Promoting Ship Detection in Visual Perception System

Author

Listed:
  • Ting Liu
  • Baijun Zhou
  • Bekir Sahin

Abstract

Maritime video surveillance based on visual perception systems has become an essential method for guaranteeing unmanned surface vessel (USV) traffic safety and security in maritime applications. However, when visual data are collected in a foggy marine environment, the essential optical information is often hidden by the fog, potentially reducing the accuracy of ship detection. Therefore, a dual-channel and two-stage dehazing network (DTDNet) is proposed to improve image clarity and quality so that reliable ship detection can be guaranteed under foggy conditions. Specifically, an upper and lower sampling structure is introduced to expand the original two-stage dehazing network into a dual-channel network, which captures image features at different scales. Meanwhile, an attention mechanism assigns different weights to different feature maps to retain more image information. Furthermore, a perceptual loss is combined with the MSE-based loss function to better reduce the gap between the dehazed image and the haze-free image. Extensive experiments show that DTDNet outperforms other state-of-the-art dehazing networks in both visual effects and quantitative indices. Moreover, the dehazing network is applied to the problem of ship detection in a sea-fog environment, and experimental results demonstrate that our network can effectively improve the visual perception performance of USVs.
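The combined objective described in the abstract (a pixel-wise MSE term plus a perceptual term computed on feature maps) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy convolutional feature extractor, the balancing weight `lam`, and the array shapes are all assumptions; in practice the perceptual term would typically use features from a pretrained network.

```python
import numpy as np

def mse_loss(pred, target):
    """Pixel-wise mean squared error between dehazed and haze-free images."""
    return float(np.mean((pred - target) ** 2))

def feature_map(img, kernel):
    """Toy stand-in for a learned feature extractor: one valid-mode 2D
    convolution applied per channel (hypothetical, for illustration only)."""
    h, w, c = img.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1, c))
    for ch in range(c):
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j, ch] = np.sum(img[i:i + kh, j:j + kw, ch] * kernel)
    return out

def perceptual_loss(pred, target, kernel):
    """MSE computed in feature space rather than pixel space."""
    return mse_loss(feature_map(pred, kernel), feature_map(target, kernel))

def combined_loss(pred, target, kernel, lam=0.1):
    """Pixel MSE plus a weighted perceptual term; lam is an assumed
    balancing weight, not a value reported by the paper."""
    return mse_loss(pred, target) + lam * perceptual_loss(pred, target, kernel)
```

For two identical images the loss is zero, and it grows as the dehazed output drifts from the haze-free reference in either pixel space or feature space; the weight `lam` trades off pixel fidelity against perceptual similarity.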

Suggested Citation

  • Ting Liu & Baijun Zhou & Bekir Sahin, 2022. "Dual-Channel and Two-Stage Dehazing Network for Promoting Ship Detection in Visual Perception System," Mathematical Problems in Engineering, Hindawi, vol. 2022, pages 1-15, May.
  • Handle: RePEc:hin:jnlmpe:8998743
    DOI: 10.1155/2022/8998743

    Download full text from publisher

    File URL: http://downloads.hindawi.com/journals/mpe/2022/8998743.pdf
    Download Restriction: no

    File URL: http://downloads.hindawi.com/journals/mpe/2022/8998743.xml
    Download Restriction: no

    File URL: https://libkey.io/10.1155/2022/8998743?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hin:jnlmpe:8998743. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help by adding them using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Mohamed Abdelhakeem (email available below). General contact details of provider: https://www.hindawi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.