<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-14T06:26:37Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:10256/28379" metadataPrefix="qdc">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:10256/28379</identifier><datestamp>2026-03-07T19:50:54Z</datestamp><setSpec>com_2072_452966</setSpec><setSpec>com_2072_2054</setSpec><setSpec>col_2072_452969</setSpec></header><metadata><qdc:qualifieddc xmlns:qdc="http://dspace.org/qualifieddc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://purl.org/dc/elements/1.1/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dc.xsd http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dcterms.xsd http://dspace.org/qualifieddc/ http://www.ukoln.ac.uk/metadata/dcmi/xmlschema/qualifieddc.xsd">
   <dc:title>Joint underwater mapping with acoustic and optical images</dc:title>
   <dc:creator>Philip-Ifabiyi, Precious</dc:creator>
   <dc:subject>Autonomous Underwater Vehicles</dc:subject>
   <dc:subject>Autonomous Underwater Vehicles -- Navigation systems</dc:subject>
   <dc:subject>Vehicles submergibles autònoms -- Sistemes de navegació</dc:subject>
   <dc:subject>Digital mapping</dc:subject>
   <dc:subject>Cartografia digital</dc:subject>
   <dc:subject>SLAM</dc:subject>
   <dc:subject>Sonar</dc:subject>
   <dc:subject>Sonar (Navegació)</dc:subject>
   <dc:subject>Algorismes</dc:subject>
   <dc:subject>Algorithms</dc:subject>
   <dcterms:abstract>This thesis is developed within the context of the IURBI project [1], which seeks to&#xd;
develop an intelligent AUV capable of real-time seafloor analysis and adaptive mission&#xd;
planning (Figure 1.1). A fundamental prerequisite for such autonomous capabilities is&#xd;
the ability to robustly align and fuse sensor data from multiple sources and surveys into a&#xd;
single, coherent model. This thesis addresses that foundational challenge by developing&#xd;
a comprehensive offline framework for multi-session, multimodal map alignment.&#xd;
The primary objectives of this thesis are to:&#xd;
– Develop a robust and flexible framework for the alignment and integration of&#xd;
side-scan sonar and optical imagery acquired in single or multiple sessions by&#xd;
AUVs, towfish, or ROVs.&#xd;
– Formulate and implement a factor graph optimization approach to jointly refine vehicle trajectories and sensor alignments across multiple sessions and&#xd;
modalities, accommodating the inherent uncertainties in underwater navigation.&#xd;
– Evaluate the performance of the proposed methodology using real-world underwater datasets, assessing its accuracy, robustness, and practical applicability.&#xd;
The scope of this work encompasses the offline processing and alignment of previously collected side-scan sonar and optical image datasets. While initial navigation&#xd;
data from the AUV/ROV is assumed to be available, this work specifically focuses on&#xd;
refining these initial pose estimates to achieve precise multimodal and multi-session&#xd;
co-registration.</dcterms:abstract>
   <dcterms:dateAccepted>2026-03-07T19:50:54Z</dcterms:dateAccepted>
   <dcterms:available>2026-03-07T19:50:54Z</dcterms:available>
   <dcterms:created>2026-03-07T19:50:54Z</dcterms:created>
   <dcterms:issued>2025-06</dcterms:issued>
   <dc:type>info:eu-repo/semantics/masterThesis</dc:type>
   <dc:identifier>https://hdl.handle.net/10256/28379</dc:identifier>
   <dc:rights>Attribution-NonCommercial-NoDerivatives 4.0 International</dc:rights>
   <dc:rights>http://creativecommons.org/licenses/by-nc-nd/4.0/</dc:rights>
   <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
   <dc:publisher>Universitat de Girona. Institut de Recerca en Visió per Computador i Robòtica</dc:publisher>
   <dc:source>Erasmus Mundus Joint Master in Intelligent Field Robotic Systems (IFROS)</dc:source>
</qdc:qualifieddc></metadata></record></GetRecord></OAI-PMH>