Author
Listed:
- Shouwei Wang
(College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
National Engineering Technology Research Center of State Forestry and Grassland Administration on Forestry and Grassland Machinery for Hilly and Mountainous Areas, Hangzhou 311300, China
Key Laboratory of Agricultural Equipment for Hilly and Mountainous Areas in South-Eastern China (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Hangzhou 311300, China)
- Lijian Yao
(College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
National Engineering Technology Research Center of State Forestry and Grassland Administration on Forestry and Grassland Machinery for Hilly and Mountainous Areas, Hangzhou 311300, China
Key Laboratory of Agricultural Equipment for Hilly and Mountainous Areas in South-Eastern China (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Hangzhou 311300, China)
- Lijun Xu
(College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
National Engineering Technology Research Center of State Forestry and Grassland Administration on Forestry and Grassland Machinery for Hilly and Mountainous Areas, Hangzhou 311300, China
Key Laboratory of Agricultural Equipment for Hilly and Mountainous Areas in South-Eastern China (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Hangzhou 311300, China)
- Dong Hu
(College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
National Engineering Technology Research Center of State Forestry and Grassland Administration on Forestry and Grassland Machinery for Hilly and Mountainous Areas, Hangzhou 311300, China
Key Laboratory of Agricultural Equipment for Hilly and Mountainous Areas in South-Eastern China (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Hangzhou 311300, China)
- Jiawei Zhou
(College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
National Engineering Technology Research Center of State Forestry and Grassland Administration on Forestry and Grassland Machinery for Hilly and Mountainous Areas, Hangzhou 311300, China
Key Laboratory of Agricultural Equipment for Hilly and Mountainous Areas in South-Eastern China (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Hangzhou 311300, China)
- Yexin Chen
(College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
National Engineering Technology Research Center of State Forestry and Grassland Administration on Forestry and Grassland Machinery for Hilly and Mountainous Areas, Hangzhou 311300, China
Key Laboratory of Agricultural Equipment for Hilly and Mountainous Areas in South-Eastern China (Co-Construction by Ministry and Province), Ministry of Agriculture and Rural Affairs, Hangzhou 311300, China)
Abstract
To address the limitations of existing methods in differentiating vegetables from the full range of weeds in farmland, a new image segmentation method based on an improved YOLOv7-tiny is proposed. Building on the original YOLOv7-tiny framework, we replace the CIoU loss function with the WIoU loss function, replace the Leaky ReLU activation function with the SiLU activation function, introduce the SimAM attention mechanism in the neck network, and integrate the PConv convolution module into the backbone network. The improved YOLOv7-tiny is used for vegetable target detection, while the ExG index, combined with the OTSU method, is used to obtain a foreground image containing both vegetables and weeds. By combining the vegetable detection results with the foreground image, a vegetable distribution map is generated. Subsequently, the vegetable targets are excluded from the foreground image using the vegetable distribution map, leaving only the weed targets and thereby achieving accurate segmentation of vegetables and weeds. The experimental results show that the improved YOLOv7-tiny achieves an average precision of 96.5% for vegetable detection, with a frame rate of 89.3 fps, 8.2 M parameters, and 10.9 G FLOPs, surpassing the original YOLOv7-tiny in both detection accuracy and speed. The image segmentation algorithm achieves an mIoU of 84.8% and an mPA of 97.8%. This method can effectively segment vegetables and a variety of weeds, reduces the complexity of segmentation while remaining practical, and provides a reference for the development of intelligent plant protection robots.
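The segmentation pipeline summarized in the abstract (ExG index, Otsu thresholding, then removal of detected vegetable regions) can be illustrated with a short sketch. The following Python/OpenCV code is a minimal, hedged example and not the authors' implementation: the function name, the (x1, y1, x2, y2) box format, and the normalization of ExG before Otsu thresholding are illustrative assumptions; any detector producing vegetable bounding boxes (here, the improved YOLOv7-tiny) could supply the boxes.

```python
import cv2
import numpy as np

def segment_weeds(bgr_image, vegetable_boxes):
    """Illustrative sketch: separate weeds from vegetables in a field image.

    bgr_image:        H x W x 3 uint8 image (OpenCV BGR order).
    vegetable_boxes:  iterable of (x1, y1, x2, y2) boxes from a detector
                      (assumed format, not the paper's exact output).
    Returns (foreground_mask, vegetable_mask, weed_mask) as uint8 masks.
    """
    # Excess-green (ExG) index: 2G - R - B on channels scaled to [0, 1].
    b, g, r = cv2.split(bgr_image.astype(np.float32) / 255.0)
    exg = 2.0 * g - r - b

    # Rescale ExG to 0-255 so Otsu's method can choose a global threshold.
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, foreground_mask = cv2.threshold(
        exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Vegetable distribution map: foreground pixels inside detected boxes.
    vegetable_mask = np.zeros_like(foreground_mask)
    for x1, y1, x2, y2 in vegetable_boxes:
        vegetable_mask[y1:y2, x1:x2] = foreground_mask[y1:y2, x1:x2]

    # Weeds are the foreground pixels left after removing the vegetables.
    weed_mask = cv2.subtract(foreground_mask, vegetable_mask)
    return foreground_mask, vegetable_mask, weed_mask
```

In this sketch the detector and the color-index thresholding are fully decoupled, which mirrors the paper's design: detection errors affect only which foreground pixels are assigned to vegetables, while the green/soil separation is handled by ExG and Otsu alone.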
Suggested Citation
Shouwei Wang & Lijian Yao & Lijun Xu & Dong Hu & Jiawei Zhou & Yexin Chen, 2024.
"An Improved YOLOv7-Tiny Method for the Segmentation of Images of Vegetable Fields,"
Agriculture, MDPI, vol. 14(6), pages 1-16, May.
Handle:
RePEc:gam:jagris:v:14:y:2024:i:6:p:856-:d:1404943
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Jingyu Wang & Miaomiao Li & Chen Han & Xindong Guo, 2024.
"YOLOv8-RCAA: A Lightweight and High-Performance Network for Tea Leaf Disease Detection,"
Agriculture, MDPI, vol. 14(8), pages 1-20, July.
- Abdullah Addas & Muhammad Tahir & Najma Ismat, 2023.
"Enhancing Precision of Crop Farming towards Smart Cities: An Application of Artificial Intelligence,"
Sustainability, MDPI, vol. 16(1), pages 1-18, December.
- Shenghao Ye & Xinyu Xue & Shuning Si & Yang Xu & Feixiang Le & Longfei Cui & Yongkui Jin, 2023.
"Design and Testing of an Elastic Comb Reciprocating a Soybean Plant-to-Plant Seedling Avoidance and Weeding Device,"
Agriculture, MDPI, vol. 13(11), pages 1-23, November.