Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables

Author

Listed:
  • Xianping Guan

    (Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
    School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China)

  • Longyuan Shi

    (Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
    School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China)

  • Weiguang Yang

    (Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
    School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China)

  • Hongrui Ge

    (Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
    School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China)

  • Xinhua Wei

    (Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
    School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China)

  • Yuhan Ding

    (School of Electrical and Information Engineering, Jiangsu University, Zhenjiang 212013, China)

Abstract

Vision-based recognition and localization play a crucial role in the unmanned harvesting of aquatic vegetables. Field investigation shows that illumination, shading, and computational cost are the main factors limiting the identification and positioning of Brasenia schreberi. This paper therefore proposes a new lightweight detection method, YOLO-GS, which fuses feature information from RGB and depth images for recognition and localization. YOLO-GS replaces standard convolutions with Ghost convolution modules and introduces C3-GS, a cross-stage module, to reduce parameters and computational cost, and its redesigned detection head significantly enhances feature extraction in complex environments. The model also adopts Focal-EIoU as the bounding-box regression loss to mitigate the adverse effect of low-quality samples on gradients. We built a dataset of 1500 images of Brasenia schreberi covering various complex scenarios. Trained on this dataset, YOLO-GS achieves an average accuracy of 95.7% with a model size of 7.95 MB, 3.75 M parameters, and a computational cost of 9.5 GFLOPs. Compared with the original YOLOv5s model, YOLO-GS improves recognition accuracy by 2.8%, reduces the model size and parameter count by 43.6% and 46.5%, respectively, and cuts computation by 39.9%. Furthermore, the positioning errors of picking points are less than 5.01 mm in the X direction, 3.65 mm in the Y direction, and 1.79 mm in the Z direction. YOLO-GS thus combines high recognition accuracy with low computational demand, enabling precise target identification and localization in complex environments and meeting the requirements of real-time harvesting tasks.
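
The Ghost convolution replacement described in the abstract can be illustrated with a minimal PyTorch sketch of a GhostNet-style module, assuming the standard formulation (a narrow primary convolution followed by cheap depthwise "ghost" operations); the class name GhostConv, the SiLU activation, and the hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal GhostNet-style convolution sketch (illustrative; not the authors' implementation).
import math
import torch
import torch.nn as nn

class GhostConv(nn.Module):
    """Produce out_channels feature maps from a narrow primary conv plus cheap depthwise ops."""

    def __init__(self, in_channels, out_channels, kernel_size=1, ratio=2, dw_size=3, stride=1):
        super().__init__()
        self.out_channels = out_channels
        init_channels = math.ceil(out_channels / ratio)   # maps from the primary convolution
        cheap_channels = init_channels * (ratio - 1)       # "ghost" maps from the depthwise conv

        # Primary convolution: a standard but deliberately narrow convolution.
        self.primary = nn.Sequential(
            nn.Conv2d(in_channels, init_channels, kernel_size, stride,
                      kernel_size // 2, bias=False),
            nn.BatchNorm2d(init_channels),
            nn.SiLU(),
        )
        # Cheap operation: depthwise convolution applied to the primary output.
        self.cheap = nn.Sequential(
            nn.Conv2d(init_channels, cheap_channels, dw_size, 1,
                      dw_size // 2, groups=init_channels, bias=False),
            nn.BatchNorm2d(cheap_channels),
            nn.SiLU(),
        )

    def forward(self, x):
        y = self.primary(x)
        out = torch.cat([y, self.cheap(y)], dim=1)
        return out[:, : self.out_channels]                 # trim any rounding surplus

# Shape check: same spatial size, requested channel count, far fewer parameters than a full conv.
x = torch.randn(1, 64, 80, 80)
print(GhostConv(64, 128)(x).shape)                         # torch.Size([1, 128, 80, 80])
```

With ratio = 2, roughly half of the output channels come from cheap depthwise operations rather than a full convolution, which is the mechanism behind the parameter and FLOP reductions reported for YOLO-GS.

The depth-based localization of picking points can likewise be sketched under the assumption of a standard pinhole camera model with a depth map aligned to the RGB image; the intrinsics and pixel coordinates below are placeholders, not values from the paper.

```python
# Hypothetical back-projection of a detected picking point into camera-frame coordinates
# (pinhole model, aligned depth; all numeric values are placeholders).
import numpy as np

def pixel_to_camera_xyz(u, v, depth_mm, fx, fy, cx, cy):
    """Convert a pixel (u, v) with depth in millimetres to camera-frame (X, Y, Z) in millimetres."""
    z = float(depth_mm)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example: a detected picking point at pixel (412, 305) with a 560 mm depth reading.
print(pixel_to_camera_xyz(412, 305, 560.0, fx=615.0, fy=615.0, cx=320.0, cy=240.0))
```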

Suggested Citation

  • Xianping Guan & Longyuan Shi & Weiguang Yang & Hongrui Ge & Xinhua Wei & Yuhan Ding, 2024. "Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables," Agriculture, MDPI, vol. 14(7), pages 1-25, June.
  • Handle: RePEc:gam:jagris:v:14:y:2024:i:7:p:971-:d:1419904

    Download full text from publisher

    File URL: https://www.mdpi.com/2077-0472/14/7/971/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2077-0472/14/7/971/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:14:y:2024:i:7:p:971-:d:1419904. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager. General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.