Authors
- Shanghao Liu
(College of Information Engineering, Northwest A&F University, Xianyang 712100, China
Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Chunjiang Zhao
(College of Information Engineering, Northwest A&F University, Xianyang 712100, China
Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Hongming Zhang
(College of Information Engineering, Northwest A&F University, Xianyang 712100, China)
- Qifeng Li
(Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Shuqin Li
(College of Information Engineering, Northwest A&F University, Xianyang 712100, China)
- Yini Chen
(Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
School of Mathematics and Physics, North China Electric Power University, Beijing 102206, China)
- Ronghua Gao
(Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Rong Wang
(College of Information Engineering, Northwest A&F University, Xianyang 712100, China
Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China)
- Xuwen Li
(Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
School of Computer and Information Engineering, Tianjin Agricultural University, Tianjin 300384, China)
Abstract
A clear understanding of the number of pigs plays a crucial role in breeding management. Computer vision technology is harmless and labour-saving compared with traditional counting methods. Nevertheless, existing methods still face several challenges: (1) the lack of a substantial high-precision pig-counting dataset; (2) creating a dataset for instance segmentation is time-consuming and labor-intensive; (3) mutual occlusion and overlapping often lead to incorrect recognition of pigs; (4) existing counting approaches, such as object detection, have limited accuracy. To address the issues of dataset scarcity and labor-intensive manual labeling, we built a semi-automatic instance labeling tool (SAI) and used it to produce a high-precision pig-counting dataset named Count1200, comprising 1220 images and 25,762 instances; labels are produced far faster than with manual annotation. We also propose a concise and efficient instance segmentation model for pig counting, the Instances Counting Network (ICNet), built upon several novel modules. ICNet is a dual-branch model, trained from scratch, formed from a stack of Parallel Deformable Convolutions Layers (PDCLs), each primarily composed of a pair of parallel deformable convolution blocks (PDCBs). We leverage the ability to model long-range dependencies in building our basic block and compute layer, and the resulting large effective receptive field lets PDCL perform better on multi-scale objects. Trading off computational resources against performance, ICNet surpasses other models on Count1200, achieving an AP of 71.4% and an AP50 of 95.7% in our experiments. This work offers inspiration for the rapid creation of high-precision datasets and proposes an accurate approach to pig counting.
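The building block named in the abstract, the parallel deformable convolution block (PDCB), rests on deformable convolution: each kernel tap samples the input at a learned fractional offset instead of on the fixed grid, which is what enlarges the effective receptive field. The paper's actual architecture is not reproduced here; the following is a minimal single-channel sketch of the deformable-convolution idea only, where `deformable_conv3x3` and its list-based tensors are illustrative names and shapes, not the authors' code.

```python
import math

def bilinear_sample(img, y, x):
    """Sample a 2-D grid at a fractional location (y, x) via bilinear
    interpolation; positions outside the grid contribute zero."""
    h, w = len(img), len(img[0])
    y0, x0 = math.floor(y), math.floor(x)
    val = 0.0
    for yy in (y0, y0 + 1):
        for xx in (x0, x0 + 1):
            if 0 <= yy < h and 0 <= xx < w:
                val += (1 - abs(y - yy)) * (1 - abs(x - xx)) * img[yy][xx]
    return val

def deformable_conv3x3(img, kernel, offsets):
    """Single-channel 3x3 deformable convolution (cross-correlation form).

    offsets[i][j][t] is a learned (dy, dx) shift for kernel tap t at output
    pixel (i, j); all-zero offsets reduce this to a standard 3x3 convolution
    with zero padding."""
    h, w = len(img), len(img[0])
    taps = [(ky, kx) for ky in (-1, 0, 1) for kx in (-1, 0, 1)]
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for t, (ky, kx) in enumerate(taps):
                dy, dx = offsets[i][j][t]
                acc += kernel[ky + 1][kx + 1] * bilinear_sample(
                    img, i + ky + dy, j + kx + dx)
            out[i][j] = acc
    return out
```

In a trained network the offsets are themselves predicted by a small convolution over the input; in practice one would use an optimized implementation such as `torchvision.ops.DeformConv2d` rather than this loop-based sketch.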
Suggested Citation
Shanghao Liu & Chunjiang Zhao & Hongming Zhang & Qifeng Li & Shuqin Li & Yini Chen & Ronghua Gao & Rong Wang & Xuwen Li, 2024.
"ICNet: A Dual-Branch Instance Segmentation Network for High-Precision Pig Counting,"
Agriculture, MDPI, vol. 14(1), pages 1-15, January.
Handle:
RePEc:gam:jagris:v:14:y:2024:i:1:p:141-:d:1321486