Author
Listed:
- André Silva Aguiar
(INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal
School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal)
- Nuno Namora Monteiro
(Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal)
- Filipe Neves dos Santos
(INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal)
- Eduardo J. Solteiro Pires
(INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal
School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal)
- Daniel Silva
(INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal
School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal)
- Armando Jorge Sousa
(INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal
Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal)
- José Boaventura-Cunha
(INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal
School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal)
Abstract
The development of robotic solutions in unstructured environments brings several challenges, mainly in developing safe and reliable navigation solutions. Agricultural environments are particularly unstructured and, therefore, challenging for the implementation of robotics. An example is mountain vineyards, built on steep-slope hills, which are characterized by satellite signal blockage, terrain irregularities, harsh ground inclinations, and other factors. All of these factors require precise and reliable navigation algorithms so that robots can operate safely. This work proposes the detection of semantic natural landmarks to be used in Simultaneous Localization and Mapping algorithms. Thus, Deep Learning models were trained and deployed to detect vine trunks. As significant contributions, we make available a novel vine trunk dataset, called VineSet, consisting of more than 9000 images and the respective annotations for each trunk. VineSet was used to train state-of-the-art Single Shot Multibox Detector models. Additionally, we deployed these models in an Edge-AI fashion and achieved high frame-rate execution. Finally, an assisted annotation tool was proposed to make the dataset-building process easier and to improve the models incrementally. The experiments show that our trained models can detect trunks with an Average Precision of up to 84.16% and that our assisted annotation tool facilitates the annotation process, even in other areas of agriculture, such as orchards and forests. Additional experiments were performed to evaluate the impact of the amount of training data and to compare Transfer Learning with training from scratch; in these cases, some theoretical assumptions were verified.
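As an illustration of the trunk detection step summarized in the abstract, the sketch below runs a generic Single Shot Multibox Detector exported to TensorFlow Lite on a single image. This is not the authors' code: the model file, image path, confidence threshold, and the ordering of the output tensors are assumptions made for the example.

```python
# Illustrative sketch only: model file, image path, and output tensor ordering
# are assumptions for this example, not taken from the paper.
import tensorflow as tf

MODEL_PATH = "ssd_vine_trunk.tflite"   # hypothetical exported SSD detector
IMAGE_PATH = "vineyard_row.jpg"        # hypothetical test image

# Load the TFLite detector and query its expected input shape.
interpreter = tf.lite.Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
_, height, width, _ = input_details[0]["shape"]

# Read and resize the image to the detector's input resolution.
image = tf.io.decode_image(tf.io.read_file(IMAGE_PATH), channels=3)
resized = tf.image.resize(image, (height, width))
input_data = tf.cast(tf.expand_dims(resized, 0), input_details[0]["dtype"])

# Run inference. A standard SSD TFLite export yields boxes, classes,
# scores, and the number of detections (this ordering is assumed here).
interpreter.set_tensor(input_details[0]["index"], input_data.numpy())
interpreter.invoke()
boxes = interpreter.get_tensor(output_details[0]["index"])[0]   # [N, 4], normalized
scores = interpreter.get_tensor(output_details[2]["index"])[0]  # [N]

# Keep confident trunk detections only.
for (ymin, xmin, ymax, xmax), score in zip(boxes, scores):
    if score >= 0.5:
        print(f"trunk: ({xmin:.2f}, {ymin:.2f}) - ({xmax:.2f}, {ymax:.2f}), score {score:.2f}")
```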
Suggested Citation
André Silva Aguiar & Nuno Namora Monteiro & Filipe Neves dos Santos & Eduardo J. Solteiro Pires & Daniel Silva & Armando Jorge Sousa & José Boaventura-Cunha, 2021.
"Bringing Semantics to the Vineyard: An Approach on Deep Learning-Based Vine Trunk Detection,"
Agriculture, MDPI, vol. 11(2), pages 1-20, February.
Handle:
RePEc:gam:jagris:v:11:y:2021:i:2:p:131-:d:494231
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:11:y:2021:i:2:p:131-:d:494231. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.