<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-14T02:10:08Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:10459.1/71527" metadataPrefix="didl">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:10459.1/71527</identifier><datestamp>2025-06-16T19:03:44Z</datestamp><setSpec>com_2072_3622</setSpec><setSpec>col_2072_479130</setSpec></header><metadata><d:DIDL xmlns:d="urn:mpeg:mpeg21:2002:02-DIDL-NS" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="urn:mpeg:mpeg21:2002:02-DIDL-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/did/didl.xsd">
   <d:Item id="hdl_10459.1_71527">
      <d:Descriptor>
         <d:Statement mimeType="application/xml; charset=utf-8">
            <dii:Identifier xmlns:dii="urn:mpeg:mpeg21:2002:01-DII-NS" xsi:schemaLocation="urn:mpeg:mpeg21:2002:01-DII-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/dii/dii.xsd">urn:hdl:10459.1/71527</dii:Identifier>
         </d:Statement>
      </d:Descriptor>
      <d:Descriptor>
         <d:Statement mimeType="application/xml; charset=utf-8">
            <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
               <dc:title>Comparison of 3D scan matching techniques for autonomous robot navigation in urban and agricultural environments</dc:title>
               <dc:creator>Guevara, Javier</dc:creator>
               <dc:creator>Gené Mola, Jordi</dc:creator>
               <dc:creator>Gregorio López, Eduard</dc:creator>
               <dc:creator>Torres-Torriti, Miguel</dc:creator>
               <dc:creator>Reina, Giulio</dc:creator>
               <dc:creator>Auat Cheein, Fernando</dc:creator>
               <dc:subject>Autonomous vehicles</dc:subject>
               <dc:subject>3D point cloud registration</dc:subject>
               <dc:subject>Mobile robot sensing</dc:subject>
               <dc:subject>Robot localization</dc:subject>
               <dc:description>Global navigation satellite system (GNSS) is the standard solution for solving the localization problem in outdoor environments, but its signal might be lost when driving in dense urban areas or in the presence of heavy vegetation or overhanging canopies. Hence, there is a need for alternative or complementary localization methods for autonomous driving. In recent years, exteroceptive sensors have gained much attention due to significant improvements in accuracy and cost-effectiveness, especially 3D range sensors. By registering two successive 3D scans, a process known as scan matching, it is possible to estimate the pose of a vehicle. This work aims to provide an in-depth analysis and comparison of state-of-the-art 3D scan matching approaches as a solution to the localization problem of autonomous vehicles. Eight techniques (deterministic and probabilistic) are investigated: iterative closest point (in three different embodiments), normal distribution transform, coherent point drift, Gaussian mixture model, support vector-parametrized Gaussian mixture, and a particle filter implementation. They are demonstrated in long-path trials in both urban and agricultural environments and compared in terms of accuracy and consistency. On the one hand, most of the techniques can be successfully used in urban scenarios, with the probabilistic approaches showing the best accuracy. On the other hand, agricultural settings have proved more challenging, with significant errors even in short-distance trials due to the presence of featureless natural objects. The results and discussion of this work will provide a guide for selecting the most suitable method and will encourage improvements that address the identified limitations.</dc:description>
               <dc:description>This project has been supported by the National Agency of Research and Development (ANID, ex-Conicyt) under Fondecyt grant 1201319, Basal grant FB0008, DGIIP-UTFSM Chile, National Agency for Research and Development (ANID)/PCHA/Doctorado Nacional/2020-21200700, the Secretaria d’Universitats i Recerca del Departament d’Empresa i Coneixement de la Generalitat de Catalunya (grant 2017 SGR 646), and the Spanish Ministry of Science, Innovation and Universities (project RTI2018-094222-B-I00), which partially funded this research. The Spanish Ministry of Education is thanked for Mr. J. Gené’s pre-doctoral fellowship (FPU15/03355). We would also like to thank Nufri (especially Santiago Salamero and Oriol Morreres) for their support during data acquisition</dc:description>
               <dc:date>2021-04-23</dc:date>
               <dc:type>info:eu-repo/semantics/article</dc:type>
               <dc:type>acceptedVersion</dc:type>
               <dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/RTI2018-094222-B-I00/ES/TECNOLOGIAS DE AGRICULTURA DE PRECISION PARA OPTIMIZAR EL MANEJO DEL DOSEL FOLIAR Y LA PROTECCION FITOSANITARIA SOSTENIBLE EN PLANTACIONES FRUTALES/</dc:relation>
               <dc:relation>Postprint version of the document published at: https://doi.org/10.1117/1.JRS.15.024508</dc:relation>
               <dc:relation>Journal of Applied Remote Sensing, 2021, vol. 15, no. 2, p. 024508</dc:relation>
               <dc:relation>https://doi.org/10.34810/data2320</dc:relation>
               <dc:rights>(c) Society of Photo-Optical Instrumentation Engineers, 2021</dc:rights>
               <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
               <dc:publisher>Society of Photo-Optical Instrumentation Engineers</dc:publisher>
            </oai_dc:dc>
         </d:Statement>
      </d:Descriptor>
   </d:Item>
</d:DIDL></metadata></record></GetRecord></OAI-PMH>