IDEAS home Printed from https://ideas.repec.org/a/gam/jagris/v15y2025i8p827-d1632250.html

Enhancing Autonomous Orchard Navigation: A Real-Time Convolutional Neural Network-Based Obstacle Classification System for Distinguishing ‘Real’ and ‘Fake’ Obstacles in Agricultural Robotics

Author

Listed:
  • Tabinda Naz Syed

    (College of Engineering, Nanjing Agricultural University, Nanjing 210095, China)

  • Jun Zhou

    (College of Engineering, Nanjing Agricultural University, Nanjing 210095, China)

  • Imran Ali Lakhiar

    (Research Center of Fluid Machinery Engineering and Technology, Jiangsu University, Zhenjiang 212013, China)

  • Francesco Marinello

    (Department of Land, Environment, Agriculture and Forestry, University of Padova, 35020 Legnaro, Italy)

  • Tamiru Tesfaye Gemechu

    (College of Engineering, Nanjing Agricultural University, Nanjing 210095, China)

  • Luke Toroitich Rottok

    (College of Engineering, Nanjing Agricultural University, Nanjing 210095, China)

  • Zhizhen Jiang

    (College of Engineering, Nanjing Agricultural University, Nanjing 210095, China)

Abstract

Autonomous navigation in agricultural environments requires precise obstacle classification to ensure collision-free movement. This study proposes a convolutional neural network (CNN)-based model designed to enhance obstacle classification for agricultural robots, particularly in orchards. Building upon a previously developed YOLOv8n-based real-time detection system, the model incorporates Ghost Modules and Squeeze-and-Excitation (SE) blocks to enhance feature extraction while maintaining computational efficiency. Obstacles are categorized as “Real” (those that physically impede navigation, such as tree trunks and persons) or “Fake” (those that do not, such as tall weeds and tree branches), enabling precise navigation decisions. The model was trained on separate orchard and campus datasets, fine-tuned using Hyperband optimization, and evaluated on an external test set to assess generalization to unseen obstacles. Its robustness was tested under varied lighting conditions, including low-light scenarios, to ensure real-world applicability, and its computational efficiency was analyzed in terms of inference speed, memory consumption, and hardware requirements. Comparative analysis against state-of-the-art classification models (VGG16, ResNet50, MobileNetV3, DenseNet121, EfficientNetB0, and InceptionV3) confirmed the proposed model’s superior precision (p), recall (r), and F1-score, particularly in complex orchard scenarios. The model maintained strong generalization across diverse environmental conditions, including varying illumination and previously unseen obstacles. Computational analysis further revealed that the orchard-combined model achieved the highest inference speed at 2.31 FPS while maintaining a strong balance between accuracy and efficiency. When deployed in real time, the model achieved 95.0% classification accuracy in orchards and 92.0% in campus environments.
The real-time system showed a false positive rate of 8.0% in the campus environment and 2.0% in the orchard, with a consistent false negative rate of 8.0% across both environments. These results validate the model’s effectiveness for real-time obstacle differentiation in agricultural settings. Its strong generalization, robustness to unseen obstacles, and computational efficiency make it well suited for deployment in precision agriculture. Future work will focus on enhancing inference speed, improving performance under occlusion, and expanding dataset diversity to further strengthen real-world applicability.
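The SE blocks the abstract mentions recalibrate channel responses using global context before the classifier makes its Real/Fake decision. A minimal numpy sketch of that squeeze-excite-scale pattern follows; the weight matrices `w1`, `w2`, the reduction ratio, and the toy feature map are illustrative assumptions, not the authors' trained parameters or exact architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(feature_map, w1, w2):
    """Squeeze-and-Excitation: reweight channels by global context.

    feature_map: (C, H, W) array
    w1: (C // r, C) bottleneck weights, w2: (C, C // r) expansion weights,
    where r is the channel reduction ratio.
    """
    # Squeeze: global average pooling over the spatial dims -> (C,)
    z = feature_map.mean(axis=(1, 2))
    # Excitation: FC -> ReLU -> FC -> sigmoid gives per-channel gates in (0, 1)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))
    # Scale: recalibrate each input channel by its gate
    return feature_map * s[:, None, None]

# Toy example: 4 channels, reduction ratio r = 2, random 8x8 feature map
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
w1 = rng.standard_normal((2, 4)) * 0.1
w2 = rng.standard_normal((4, 2)) * 0.1
y = se_block(x, w1, w2)
print(y.shape)  # (4, 8, 8)
```

In a real network the two weight matrices are learned end to end, so informative channels (e.g. trunk-like vertical structure) receive gates near 1 while distractor channels are suppressed, at the cost of only two small fully connected layers per block.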

Suggested Citation

  • Tabinda Naz Syed & Jun Zhou & Imran Ali Lakhiar & Francesco Marinello & Tamiru Tesfaye Gemechu & Luke Toroitich Rottok & Zhizhen Jiang, 2025. "Enhancing Autonomous Orchard Navigation: A Real-Time Convolutional Neural Network-Based Obstacle Classification System for Distinguishing ‘Real’ and ‘Fake’ Obstacles in Agricultural Robotics," Agriculture, MDPI, vol. 15(8), pages 1-30, April.
  • Handle: RePEc:gam:jagris:v:15:y:2025:i:8:p:827-:d:1632250

    Download full text from publisher

    File URL: https://www.mdpi.com/2077-0472/15/8/827/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2077-0472/15/8/827/
    Download Restriction: no


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.