Authors
Listed:
- Muhammad Atif Butt
- Asad Masood Khattak
- Sarmad Shafique
- Bashir Hayat
- Saima Abid
- Ki-Il Kim
- Muhammad Waqas Ayub
- Ahthasham Sajid
- Awais Adnan
- M. Irfan Uddin
Abstract
In step with rapid advancements in computer vision, vehicle classification demonstrates a considerable potential to reshape intelligent transportation systems. In the last couple of decades, image processing and pattern recognition-based vehicle classification systems have been used to improve the effectiveness of automated highway toll collection and traffic monitoring systems. However, these methods are trained on limited handcrafted features extracted from small datasets, which do not cater the real-time road traffic conditions. Deep learning-based classification systems have been proposed to incorporate the above-mentioned issues in traditional methods. However, convolutional neural networks require piles of data including noise, weather, and illumination factors to ensure robustness in real-time applications. Moreover, there is no generalized dataset available to validate the efficacy of vehicle classification systems. To overcome these issues, we propose a convolutional neural network-based vehicle classification system to improve robustness of vehicle classification in real-time applications. We present a vehicle dataset comprising of 10,000 images categorized into six-common vehicle classes considering adverse illuminous conditions to achieve robustness in real-time vehicle classification systems. Initially, pretrained AlexNet, GoogleNet, Inception-v3, VGG, and ResNet are fine-tuned on self-constructed vehicle dataset to evaluate their performance in terms of accuracy and convergence. Based on better performance, ResNet architecture is further improved by adding a new classification block in the network. To ensure generalization, we fine-tuned the network on the public VeRi dataset containing 50,000 images, which have been categorized into six vehicle classes. Finally, a comparison study has been carried out between the proposed and existing vehicle classification methods to evaluate the effectiveness of the proposed vehicle classification system. 
Consequently, our proposed system achieved 99.68%, 99.65%, and 99.56% accuracy, precision, and F1-score on our self-constructed dataset.
Suggested Citation
Muhammad Atif Butt & Asad Masood Khattak & Sarmad Shafique & Bashir Hayat & Saima Abid & Ki-Il Kim & Muhammad Waqas Ayub & Ahthasham Sajid & Awais Adnan & M. Irfan Uddin, 2021.
"Convolutional Neural Network Based Vehicle Classification in Adverse Illuminous Conditions for Intelligent Transportation Systems,"
Complexity, Hindawi, vol. 2021, pages 1-11, February.
Handle:
RePEc:hin:complx:6644861
DOI: 10.1155/2021/6644861