
From Pixels To Predictions: Role Of Boosted Deep Learning-Enabled Object Detection For Autonomous Vehicles On Large Scale Consumer Electronics Environment

Author

Listed:
  • MIMOUNA ABDULLAH ALKHONAINI

    (Department of Computer Science, College of Computer and Information Sciences, Prince Sultan University, Saudi Arabia)

  • HANAN ABDULLAH MENGASH

    (Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P. O. Box 84428, Riyadh 11671, Saudi Arabia)

  • NADHEM NEMRI

    (Department of Information Systems, Applied College at Mahayil, King Khalid University, Saudi Arabia)

  • SHOUKI A. EBAD

    (Department of Computer Science, Faculty of Science, Northern Border University, Arar 91431, Saudi Arabia)

  • FAIZ ABDULLAH ALOTAIBI

    (Department of Information Science, College of Humanities and Social Sciences, King Saud University, P. O. Box 28095, Riyadh 11437, Saudi Arabia)

  • JAWHARA ALJABRI

    (Department of Computer Science, University College in Umluj, University of Tabuk, Saudi Arabia)

  • YAZEED ALZAHRANI

    (Department of Computer Engineering, College of Engineering in Wadi Addawasir, Prince Sattam bin Abdulaziz University, Saudi Arabia)

  • MRIM M. ALNFIAI

    (Department of Information Technology, College of Computers and Information Technology, Taif University, P. O. Box 11099, Taif 21944, Saudi Arabia)

Abstract

Consumer electronics (CE) companies can contribute significantly to the advancement of autonomous vehicles and their supporting technologies by providing security, connectivity, and efficiency. The consumer autonomous vehicle market is set for substantial growth, driven by the growing awareness and adoption of sustainable practices that use computing technologies to optimize traffic flow in smart cities. In response to rising environmental concerns, businesses are increasingly focusing on eco-friendly solutions, combining AI, communication networks, and sensors for autonomous urban navigation to deliver safer and more efficient mobility. Object detection is a crucial element of autonomous vehicles and related complex systems, enabling them to observe and react to their surroundings in real time. Many autonomous vehicles employ deep learning (DL) for detection and deploy sensor arrays tailored to their use case or environment; DL processes this sensory data and supports data-driven decisions on obstacle recognition and reaction to the environment. This paper presents a Galactical Swarm Fractals Optimizer with DL-Enabled Object Detection for Autonomous Vehicles (GSODL-OOAV) model for smart cities. The GSODL-OOAV model performs accurate object identification for autonomous vehicles. To accomplish this, it first employs a RetinaNet object detector to locate objects effectively. A boosted long short-term memory ensemble (BLSTME) technique is then used to assign classes to the detected objects, and a hyperparameter tuning procedure based on the GSO model enhances the classification performance of the BLSTME approach. The GSODL-OOAV technique is validated experimentally on the BDD100K database, where a comparative study shows a superior accuracy of 99.06% over existing state-of-the-art approaches.
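The abstract describes a two-stage pipeline: a RetinaNet detector followed by an LSTM-ensemble classifier. Below is a minimal sketch of such a pipeline using off-the-shelf PyTorch/torchvision components; the backbone configuration, crop preprocessing, class count, and ensemble size are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn
from torchvision.models.detection import retinanet_resnet50_fpn

# Stage 1: RetinaNet object detector (pretrained COCO weights as a stand-in
# for a detector trained on driving scenes).
detector = retinanet_resnet50_fpn(weights="DEFAULT").eval()

class LSTMClassifier(nn.Module):
    """One ensemble member: reads a resized crop as a row-wise sequence."""
    def __init__(self, in_features: int, hidden: int, num_classes: int):
        super().__init__()
        self.lstm = nn.LSTM(in_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):            # x: (batch, seq_len, in_features)
        _, (h_n, _) = self.lstm(x)   # final hidden state summarizes the crop
        return self.head(h_n[-1])    # (batch, num_classes) logits

class LSTMEnsemble(nn.Module):
    """Averages the logits of several LSTM classifiers (simple ensembling)."""
    def __init__(self, in_features, hidden, num_classes, members=3):
        super().__init__()
        self.members = nn.ModuleList(
            [LSTMClassifier(in_features, hidden, num_classes) for _ in range(members)]
        )

    def forward(self, x):
        return torch.stack([m(x) for m in self.members]).mean(dim=0)

# Stage 2: assign classes to detected objects. Assumed setup: 10 object
# categories and 32x32 single-channel crops read as 32 rows of 32 features.
classifier = LSTMEnsemble(in_features=32, hidden=64, num_classes=10).eval()

with torch.no_grad():
    image = torch.rand(3, 720, 1280)            # dummy driving-scene frame
    detections = detector([image])[0]           # dict of boxes, scores, labels
    keep = detections["scores"] > 0.5           # confidence threshold (assumed)
    for box in detections["boxes"][keep]:
        crop = torch.rand(1, 32, 32)            # placeholder for a resized crop
        predicted_class = classifier(crop).argmax(dim=-1)
```

The abstract also reports a GSO-based hyperparameter tuning step for the ensemble. The simplified population search below only stands in for the Galactical Swarm Optimizer; the search space, population size, and evaluation function are placeholder assumptions.

```python
import random

SEARCH_SPACE = {"lr": (1e-4, 1e-1), "hidden": (16, 128)}

def evaluate(config) -> float:
    """Placeholder: train the ensemble with `config` and return validation
    accuracy on a held-out BDD100K split. Replaced here by a dummy score."""
    return random.random()

def sample():
    return {"lr": random.uniform(*SEARCH_SPACE["lr"]),
            "hidden": random.randint(*SEARCH_SPACE["hidden"])}

def tune(population_size=8, iterations=20, step=0.3):
    population = [sample() for _ in range(population_size)]
    best, best_score = None, float("-inf")
    for _ in range(iterations):
        scored = sorted(((evaluate(c), c) for c in population),
                        key=lambda sc: sc[0], reverse=True)
        if scored[0][0] > best_score:
            best_score, best = scored[0]
        leader = scored[0][1]
        # Pull every candidate a fraction of the way toward the current best,
        # a crude analogue of swarm attraction toward the global leader.
        population = []
        for _, c in scored:
            moved = {k: c[k] + step * (leader[k] - c[k]) for k in c}
            moved["hidden"] = int(round(moved["hidden"]))
            population.append(moved)
    return best, best_score

best_config, best_accuracy = tune()
```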

Suggested Citation

  • Mimouna Abdullah Alkhonaini & Hanan Abdullah Mengash & Nadhem Nemri & Shouki A. Ebad & Faiz Abdullah Alotaibi & Jawhara Aljabri & Yazeed Alzahrani & Mrim M. Alnfiai, 2024. "From Pixels To Predictions: Role Of Boosted Deep Learning-Enabled Object Detection For Autonomous Vehicles On Large Scale Consumer Electronics Environment," FRACTALS (fractals), World Scientific Publishing Co. Pte. Ltd., vol. 32(09n10), pages 1-17.
  • Handle: RePEc:wsi:fracta:v:32:y:2024:i:09n10:n:s0218348x2540047x
    DOI: 10.1142/S0218348X2540047X

    Download full text from publisher

    File URL: http://www.worldscientific.com/doi/abs/10.1142/S0218348X2540047X
    Download Restriction: Access to full text is restricted to subscribers

    File URL: https://libkey.io/10.1142/S0218348X2540047X?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wsi:fracta:v:32:y:2024:i:09n10:n:s0218348x2540047x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Tai Tone Lim (email available below). General contact details of provider: https://www.worldscientific.com/worldscinet/fractals .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.