Author
Listed:
- Mohd Firdaus Ibrahim
(Department of Biological and Agricultural Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang 43400, Malaysia
Faculty of Mechanical Engineering and Technology, Universiti Malaysia Perlis, Arau 02600, Malaysia)
- Siti Khairunniza-Bejo
(Department of Biological and Agricultural Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang 43400, Malaysia
Smart Farming Technology Research Centre, Universiti Putra Malaysia, Serdang 43400, Malaysia
Institute of Plantation Studies, Universiti Putra Malaysia, Serdang 43400, Malaysia)
- Marsyita Hanafi
(Department of Computer and Communication Systems Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang 43400, Malaysia)
- Mahirah Jahari
(Department of Biological and Agricultural Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang 43400, Malaysia
Smart Farming Technology Research Centre, Universiti Putra Malaysia, Serdang 43400, Malaysia)
- Fathinul Syahir Ahmad Saad
(Faculty of Electrical Engineering and Technology, Universiti Malaysia Perlis, Arau 02600, Malaysia)
- Mohammad Aufa Mhd Bookeri
(Engineering Research Centre, Malaysian Agriculture Research and Development Institute, Seberang Perai 13200, Malaysia)
Abstract
Rice serves as the primary food source for nearly half of the global population, with Asia accounting for approximately 90% of rice production worldwide. However, rice farming faces significant losses due to pest attacks. To prevent pest infestations, it is crucial to apply appropriate pesticides specific to the type of pest in the field. Traditionally, pest identification and counting have been performed manually using sticky light traps, but this process is time-consuming. In this study, a machine vision system was developed using a dataset of 7328 high-density images (1229 pixels per centimetre) of planthoppers collected in the field using sticky light traps. The dataset included four planthopper classes: brown planthopper (BPH), green leafhopper (GLH), white-backed planthopper (WBPH), and zigzag leafhopper (ZIGZAG). Five deep CNN models—ResNet-50, ResNet-101, ResNet-152, VGG-16, and VGG-19—were applied and tuned to classify the planthopper species. The experimental results indicated that the ResNet-50 model performed the best overall, achieving average values of 97.28% for accuracy, 92.05% for precision, 94.47% for recall, and 93.07% for the F1-score. In conclusion, this study successfully classified planthopper classes with excellent performance by utilising deep CNN architectures on a high-density image dataset. This capability has the potential to serve as a tool for classifying and counting planthopper samples collected using light traps.
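The abstract reports macro-averaged accuracy, precision, recall, and F1-score for the classifier. As an illustrative sketch of how such averages are derived from a per-class confusion matrix, the snippet below computes them in plain Python. The confusion-matrix counts are hypothetical; only the four class labels (BPH, GLH, WBPH, ZIGZAG) come from the paper, and the numbers shown do not reproduce the reported results.

```python
# Illustrative macro-averaged metric computation from a confusion matrix.
# Counts are hypothetical; only the class names come from the study.

CLASSES = ["BPH", "GLH", "WBPH", "ZIGZAG"]

# confusion[i][j] = number of images of true class i predicted as class j
confusion = [
    [95, 2, 2, 1],
    [3, 90, 4, 3],
    [1, 2, 96, 1],
    [2, 3, 1, 94],
]

def per_class_metrics(cm, k):
    """Precision, recall, and F1 for class index k of confusion matrix cm."""
    tp = cm[k][k]
    fp = sum(cm[i][k] for i in range(len(cm)) if i != k)  # predicted k, wrongly
    fn = sum(cm[k][j] for j in range(len(cm)) if j != k)  # true k, missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

n = len(CLASSES)
stats = [per_class_metrics(confusion, k) for k in range(n)]

# Accuracy: correct predictions over all predictions.
accuracy = sum(confusion[k][k] for k in range(n)) / sum(map(sum, confusion))

# Macro averages: unweighted mean of the per-class values.
macro_precision = sum(s[0] for s in stats) / n
macro_recall = sum(s[1] for s in stats) / n
macro_f1 = sum(s[2] for s in stats) / n
```

Macro averaging weights each of the four planthopper classes equally, so a rare class such as ZIGZAG influences the headline figures as much as a common one; this is a common choice for imbalanced field-trap datasets.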
Suggested Citation
Mohd Firdaus Ibrahim & Siti Khairunniza-Bejo & Marsyita Hanafi & Mahirah Jahari & Fathinul Syahir Ahmad Saad & Mohammad Aufa Mhd Bookeri, 2023.
"Deep CNN-Based Planthopper Classification Using a High-Density Image Dataset,"
Agriculture, MDPI, vol. 13(6), pages 1-17, May.
Handle:
RePEc:gam:jagris:v:13:y:2023:i:6:p:1155-:d:1159509
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jagris:v:13:y:2023:i:6:p:1155-:d:1159509. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.