Deep Learning-Based Localization Approach for Autonomous Robots in the RobotAtFactory 4.0 Competition

Abstract

Accurate localization enables autonomous robots to make effective decisions within their operating environment. Various methods have been developed to address this challenge, ranging from traditional techniques and fiducial markers to machine learning approaches. This work proposes a deep-learning solution employing Convolutional Neural Networks (CNNs) to tackle the localization problem, specifically in the context of the RobotAtFactory 4.0 competition. The proposed approach leverages transfer learning from the pre-trained VGG16 model to capitalize on its existing knowledge. To validate the effectiveness of the approach, a simulated scenario was employed. The experimental results demonstrated errors on the millimeter scale and response times on the order of milliseconds. Notably, the presented approach offers several advantages, including a model size that remains constant regardless of the number of training images used and the elimination of the need to know the absolute positions of the fiducial markers.
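As a rough illustration of the kind of transfer-learning setup the abstract describes, the sketch below builds a VGG16 backbone pre-trained on ImageNet, freezes its convolutional layers, and attaches a small regression head that maps a camera image to a planar pose. The input size, head layout, and the (x, y, theta) output parameterization are assumptions for illustration; the paper's exact architecture and training details are not given on this page.

```python
# Minimal sketch (not the authors' exact architecture): VGG16 transfer learning
# for pose regression. Assumed: 224x224 RGB input, a single Dense hidden layer,
# and a 3-value output (x, y, theta).
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

def build_pose_regressor(input_shape=(224, 224, 3)):
    # Reuse VGG16's ImageNet weights without its classification head.
    backbone = VGG16(weights="imagenet", include_top=False, input_shape=input_shape)
    backbone.trainable = False  # transfer learning: keep convolutional features fixed

    inputs = layers.Input(shape=input_shape)
    x = tf.keras.applications.vgg16.preprocess_input(inputs)
    x = backbone(x, training=False)
    x = layers.Flatten()(x)
    x = layers.Dense(256, activation="relu")(x)
    # Regression output: planar position (x, y) and heading theta.
    outputs = layers.Dense(3, activation="linear")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

model = build_pose_regressor()
model.summary()
```

With the backbone frozen, only the small regression head is trained, which is consistent with the advantage stated in the abstract: the resulting model size does not grow with the number of training images.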
Original language: English
Pages: 181-194
Number of pages: 14
Publication status: Published - 3 Feb 2024
Event: Optimization, Learning Algorithms and Applications - OL2A 2023 - Ponta Delgada, Portugal
Duration: 27 Sept 2023 - 29 Sept 2023
https://ol2a.ipb.pt/ol2a_2023.html

Conference

Conference: Optimization, Learning Algorithms and Applications - OL2A 2023
Abbreviated title: OL2A 2023
Country/Territory: Portugal
City: Ponta Delgada
Period: 27/09/23 - 29/09/23

Keywords

  • CNN
  • indoor localization
  • robotics competitions

Prizes

  • Best Paper Award - OL2A 2023

    Klein, Luan Carlos (Recipient), Mendes, João (Recipient), Braun, João (Recipient), Martins, Felipe (Recipient), Schneider de Oliveira, André (Recipient), Costa, Paulo Gomes (Recipient), Wörtche, Heinrich (Recipient) & Lima, José (Recipient), 29 Sept 2023
