
Moth Detection from Pheromone Trap Images Using Deep Learning Object Detectors

Authors

Listed:
  • Suk-Ju Hong

    (Department of Biosystems and Biomaterials Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea)

  • Sang-Yeon Kim

    (Department of Biosystems and Biomaterials Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea)

  • Eungchan Kim

    (Department of Biosystems and Biomaterials Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea)

  • Chang-Hyup Lee

    (Department of Biosystems and Biomaterials Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea)

  • Jung-Sup Lee

    (Protected Horticulture Research Institute, National Institute of Horticultural and Herbal Science, Rural Development Administration, 1425, Jinham-ro, Haman-myeon, Haman-gun, Gyeongsangnam-do 52054, Korea)

  • Dong-Soo Lee

    (Department of Leaders in Industry-University Cooperation, Chung-Ang University, 4726, Seodong-daero, Daedeok-myeon, Anseong-si, Gyeonggi-do 17546, Korea)

  • Jiwoong Bang

    (Protected Horticulture Research Institute, National Institute of Horticultural and Herbal Science, Rural Development Administration, 1425, Jinham-ro, Haman-myeon, Haman-gun, Gyeongsangnam-do 52054, Korea)

  • Ghiseok Kim

    (Department of Biosystems and Biomaterials Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
    Research Institute of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea)

Abstract

Diverse pheromones and pheromone-based traps, as well as images acquired from insects captured by pheromone-based traps, have been studied and developed to monitor the presence and abundance of pests and to protect plants. The purpose of this study is to construct models that detect three species of pest moths in pheromone trap images using deep learning object detection methods and to compare their speed and accuracy. Moth images in pheromone traps were collected for training and evaluation of the deep learning detectors, and the collected images were then subjected to a labeling process that defines the ground truths of the target objects, i.e., their box locations and classes. Because the dataset contained a few negative objects, non-target insects were labeled as an additional "unknown" class, and images of non-target insects were added to the dataset. Moreover, data augmentation methods were applied during training, and the parameters of detectors pre-trained on the COCO dataset were used as initial parameter values. Seven detectors (Faster R-CNN ResNet 101, Faster R-CNN ResNet 50, Faster R-CNN Inception v.2, R-FCN ResNet 101, Retinanet ResNet 50, Retinanet Mobile v.2, and SSD Inception v.2) were trained and evaluated. The Faster R-CNN ResNet 101 detector exhibited the highest accuracy (mAP of 90.25), and the seven detector types differed in both accuracy and speed. Furthermore, when unexpected insects were included in the collected images, a four-class detector that included the unknown (non-target insect) class showed a lower detection error than a three-class detector.
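The abstract describes a transfer-learning workflow: detectors pre-trained on the COCO dataset are fine-tuned on labeled trap images, with a fourth "unknown" class covering non-target insects. The sketch below illustrates that setup only in outline; it uses torchvision's Faster R-CNN as a stand-in (the page does not state which framework the authors used), and the class names are hypothetical placeholders rather than the species studied in the paper.

```python
# Minimal sketch: fine-tuning a COCO-pretrained Faster R-CNN for trap images.
# torchvision is a stand-in framework, and the class names are hypothetical;
# the paper does not specify either.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Three target moth species plus an "unknown" class for non-target insects,
# as described in the abstract; index 0 is reserved for the background class.
CLASSES = ["__background__", "moth_species_1", "moth_species_2",
           "moth_species_3", "unknown"]

# Load a detector pre-trained on COCO, mirroring the transfer-learning
# initialization reported in the study.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box predictor so it outputs the four trap classes (+ background).
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features,
                                                  num_classes=len(CLASSES))

# Training would then proceed on the labeled trap images, e.g.:
# optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
# for images, targets in data_loader:   # targets hold boxes and class ids
#     losses = model(images, targets)   # returns a dict of loss terms
#     sum(losses.values()).backward()
#     optimizer.step(); optimizer.zero_grad()
```

The extra "unknown" class gives the detector an explicit target for non-target insects instead of forcing them into one of the three moth classes, which, per the abstract, is what lowered the detection error when unexpected insects appeared in the images.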

Suggested Citation

  • Suk-Ju Hong & Sang-Yeon Kim & Eungchan Kim & Chang-Hyup Lee & Jung-Sup Lee & Dong-Soo Lee & Jiwoong Bang & Ghiseok Kim, 2020. "Moth Detection from Pheromone Trap Images Using Deep Learning Object Detectors," Agriculture, MDPI, vol. 10(5), pages 1-12, May.
  • Handle: RePEc:gam:jagris:v:10:y:2020:i:5:p:170-:d:357914

    Download full text from publisher

    File URL: https://www.mdpi.com/2077-0472/10/5/170/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2077-0472/10/5/170/
    Download Restriction: no

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Peng Wang & Jiang Liu & Lijia Xu & Peng Huang & Xiong Luo & Yan Hu & Zhiliang Kang, 2021. "Classification of Amanita Species Based on Bilinear Networks with Attention Mechanism," Agriculture, MDPI, vol. 11(5), pages 1-13, April.
    2. Jozsef Suto, 2022. "Codling Moth Monitoring with Camera-Equipped Automated Traps: A Review," Agriculture, MDPI, vol. 12(10), pages 1-18, October.
    3. Saim Khalid & Hadi Mohsen Oqaibi & Muhammad Aqib & Yaser Hafeez, 2023. "Small Pests Detection in Field Crops Using Deep Learning Object Detection," Sustainability, MDPI, vol. 15(8), pages 1-19, April.
    4. Dana Čirjak & Ivan Aleksi & Darija Lemic & Ivana Pajač Živković, 2023. "EfficientDet-4 Deep Neural Network-Based Remote Monitoring of Codling Moth Population for Early Damage Detection in Apple Orchard," Agriculture, MDPI, vol. 13(5), pages 1-20, April.
    5. Renjie Huang & Tingshan Yao & Cheng Zhan & Geng Zhang & Yongqiang Zheng, 2021. "A Motor-Driven and Computer Vision-Based Intelligent E-Trap for Monitoring Citrus Flies," Agriculture, MDPI, vol. 11(5), pages 1-27, May.
    6. Jozsef Suto, 2022. "A Novel Plug-in Board for Remote Insect Monitoring," Agriculture, MDPI, vol. 12(11), pages 1-16, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:10:y:2020:i:5:p:170-:d:357914. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.