Author
Listed:
- Tingting Yang
(College of Chemistry and Materials Engineering, Zhejiang Agriculture and Forestry University, Hangzhou 311800, China
College of Mathematics and Computer Science, Zhejiang Agriculture and Forestry University, Hangzhou 311800, China
Key Laboratory of State Forestry and Grassland Administration on Forestry Sensing Technology and Intelligent Equipment, Hangzhou 311300, China)
- Suyin Zhou
(College of Mathematics and Computer Science, Zhejiang Agriculture and Forestry University, Hangzhou 311800, China
Key Laboratory of State Forestry and Grassland Administration on Forestry Sensing Technology and Intelligent Equipment, Hangzhou 311300, China)
- Aijun Xu
(College of Mathematics and Computer Science, Zhejiang Agriculture and Forestry University, Hangzhou 311800, China
Key Laboratory of State Forestry and Grassland Administration on Forestry Sensing Technology and Intelligent Equipment, Hangzhou 311300, China)
- Junhua Ye
(College of Mathematics and Computer Science, Zhejiang Agriculture and Forestry University, Hangzhou 311800, China)
- Jianxin Yin
(College of Mathematics and Computer Science, Zhejiang Agriculture and Forestry University, Hangzhou 311800, China)
Abstract
In urban forest management, individual street tree segmentation is fundamental to obtaining tree phenotypes and is therefore especially critical. Most existing tree image segmentation models have been evaluated on small datasets and lack experimental verification on larger, publicly available datasets. This paper therefore proposes YOLO-SegNet, a method for individual street tree segmentation based on a large, publicly available urban street tree dataset. In the first-stage street tree object detection task, the BiFormer attention mechanism was introduced into the YOLOv8 network to enhance contextual information extraction and improve the network's ability to detect multiscale and multishaped targets. In the second-stage street tree segmentation task, the SegFormer network was used to obtain street tree edge information more efficiently. The experimental results indicate that the proposed YOLO-SegNet method, which combines YOLOv8+BiFormer with SegFormer, achieved a 92.0% mean intersection over union (mIoU), 95.9% mean pixel accuracy (mPA), and 97.4% accuracy on a large, publicly available urban street tree dataset. Compared with the fully convolutional network (FCN), lite reduced atrous spatial pyramid pooling (LR-ASPP), pyramid scene parsing network (PSPNet), UNet, DeepLabv3+, and HRNet, the mIoU of YOLO-SegNet increased by 10.5, 9.7, 5.0, 6.8, 4.5, and 2.7 percentage points, respectively. The proposed method can effectively support the development of smart agroforestry.
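The mIoU, mPA, and accuracy figures quoted in the abstract follow the standard semantic-segmentation definitions. As a hypothetical illustration (not the authors' evaluation code), these metrics can be computed from a per-class confusion matrix as follows:

```python
def confusion_matrix(y_true, y_pred, num_classes):
    """Build a num_classes x num_classes confusion matrix from
    flattened ground-truth and predicted label arrays."""
    m = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def segmentation_metrics(y_true, y_pred, num_classes):
    """Return mIoU, mPA, and overall pixel accuracy.

    IoU per class c = TP_c / (TP_c + FP_c + FN_c)
    PA  per class c = TP_c / (TP_c + FN_c)
    """
    m = confusion_matrix(y_true, y_pred, num_classes)
    total = sum(sum(row) for row in m)
    correct = sum(m[c][c] for c in range(num_classes))
    ious, pas = [], []
    for c in range(num_classes):
        tp = m[c][c]
        fn = sum(m[c]) - tp                               # missed pixels of class c
        fp = sum(m[r][c] for r in range(num_classes)) - tp  # pixels wrongly labeled c
        union = tp + fp + fn
        ious.append(tp / union if union else 0.0)
        gt = tp + fn
        pas.append(tp / gt if gt else 0.0)
    return {
        "mIoU": sum(ious) / num_classes,
        "mPA": sum(pas) / num_classes,
        "accuracy": correct / total,
    }

# Tiny two-class example: 0 = background, 1 = street tree
y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]
print(segmentation_metrics(y_true, y_pred, 2))
```

In practice these metrics would be computed over flattened per-pixel label maps for the whole test set; the tiny lists above stand in for such maps.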
Suggested Citation
Tingting Yang & Suyin Zhou & Aijun Xu & Junhua Ye & Jianxin Yin, 2024.
"YOLO-SegNet: A Method for Individual Street Tree Segmentation Based on the Improved YOLOv8 and the SegFormer Network,"
Agriculture, MDPI, vol. 14(9), pages 1-19, September.
Handle:
RePEc:gam:jagris:v:14:y:2024:i:9:p:1620-:d:1478939
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:14:y:2024:i:9:p:1620-:d:1478939. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.