Printed from https://ideas.repec.org/a/gam/jmathe/v13y2025i3p467-d1580719.html

Efficient Robot Localization Through Deep Learning-Based Natural Fiduciary Pattern Recognition

Authors

Listed:
  • Ramón Alberto Mena-Almonte

    (Instituto Tecnológico de Las Américas (ITLA), La Caleta, Boca Chica 11606, Dominican Republic)

  • Ekaitz Zulueta

    (System Engineering and Automation Control Department, University of the Basque Country (UPV/EHU), 01006 Vitoria-Gasteiz, Spain)

  • Ismael Etxeberria-Agiriano

    (Department of Computer Languages and Systems, University College of Engineering, University of the Basque Country (UPV/EHU), 01006 Vitoria-Gasteiz, Spain)

  • Unai Fernandez-Gamiz

    (Department of Energy Engineering, University of the Basque Country (UPV/EHU), 01006 Vitoria-Gasteiz, Spain)

Abstract

This paper introduces an efficient localization algorithm for robotic systems that uses deep learning to identify and exploit natural fiduciary patterns in the environment. Diverging from conventional localization techniques that depend on artificial markers, this method capitalizes on inherent environmental features to enhance both accuracy and computational efficiency. By integrating advanced deep learning frameworks with natural scene analysis, the proposed algorithm enables robust, real-time localization in dynamic and unstructured settings. The resulting approach offers significant improvements in adaptability, precision, and operational efficiency, representing a substantial contribution to the field of autonomous robotics. We aim to analyze an automotive manufacturing scenario in which a robot must localize itself relative to a moving target. To work with a simpler, more accessible scenario, we have chosen a demonstrative context: a laboratory wall containing several elements. This paper focuses on the first part of the case study, with a continuation planned for future work. It demonstrates a scenario in which a camera mounted on a robot captures images of the underside of a car (which we assume to be represented by a gray painted surface with specific elements described in Materials and Methods). These images are processed by a convolutional neural network (CNN) designed to detect the most distinctive features of the environment. The extracted information is crucial, as the identified characteristic areas serve as reference points for the real-time localization of the industrial robot. In this work, we demonstrate the potential of leveraging natural fiduciary patterns, identified by deep learning (specifically convolutional neural networks), for efficient and accurate robot localization.
The experimental results suggest that this approach is not only feasible but also scalable across a wide range of applications, including industrial automation, autonomous vehicles, and aerospace navigation. As robots increasingly operate in environments where computational efficiency and adaptability are paramount, our methodology offers a viable solution to enhance localization without compromising accuracy or speed. We also highlight an algorithm that extends the proposed neural-network-based natural fiduciary pattern method to more complex scenarios, together with the method's efficiency for robot localization compared to alternatives.
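The core idea described in the abstract — scoring image regions by a learned filter's response and keeping the most distinctive locations as reference points — can be illustrated with a minimal sketch. This is not the authors' actual network: the function name, the toy "wall" image, and the hand-set corner kernel are all hypothetical stand-ins for a trained CNN's feature maps, shown only to make the pipeline concrete.

```python
import numpy as np

def detect_fiduciary_points(image, kernel, top_k=3):
    """Score each patch by its filter response (valid cross-correlation)
    and return the top_k most distinctive (row, col) locations, which
    would serve as natural fiduciary reference points for localization."""
    kh, kw = kernel.shape
    h, w = image.shape
    resp = np.empty((h - kh + 1, w - kw + 1))
    for i in range(resp.shape[0]):
        for j in range(resp.shape[1]):
            resp[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    # Pick the k strongest responses as candidate fiduciary points.
    flat = np.argsort(resp, axis=None)[::-1][:top_k]
    return [tuple(int(v) for v in np.unravel_index(idx, resp.shape))
            for idx in flat]

# Toy scene: a mostly uniform gray surface with two bright, distinctive blobs,
# loosely mimicking the gray painted surface described in the abstract.
wall = np.full((16, 16), 0.5)
wall[3:5, 3:5] = 1.0
wall[10:12, 11:13] = 1.0
corner_kernel = np.ones((2, 2))  # responds strongly to bright 2x2 blobs
points = detect_fiduciary_points(wall, corner_kernel, top_k=2)
print(points)
```

In the paper's setting, a trained CNN would replace the fixed kernel, and the detected characteristic areas would be matched across frames to estimate the robot's pose relative to the moving target.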

Suggested Citation

  • Ramón Alberto Mena-Almonte & Ekaitz Zulueta & Ismael Etxeberria-Agiriano & Unai Fernandez-Gamiz, 2025. "Efficient Robot Localization Through Deep Learning-Based Natural Fiduciary Pattern Recognition," Mathematics, MDPI, vol. 13(3), pages 1-14, January.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:3:p:467-:d:1580719

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/3/467/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/3/467/
    Download Restriction: no


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.