<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-13T06:57:45Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:2117/441868" metadataPrefix="qdc">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:2117/441868</identifier><datestamp>2026-01-21T09:29:29Z</datestamp><setSpec>com_2072_1033</setSpec><setSpec>col_2072_452950</setSpec></header><metadata><qdc:qualifieddc xmlns:qdc="http://dspace.org/qualifieddc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://purl.org/dc/elements/1.1/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dc.xsd http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dcterms.xsd http://dspace.org/qualifieddc/ http://www.ukoln.ac.uk/metadata/dcmi/xmlschema/qualifieddc.xsd">
   <dc:title>Multimodal sensing prototype for robust autonomous driving under adverse weather conditions</dc:title>
   <dc:creator>Mas Giménez, Gerard de</dc:creator>
   <dc:creator>Subirana Pérez, Adrià</dc:creator>
   <dc:creator>Garcia Gómez, Pablo</dc:creator>
   <dc:creator>Bernal Pérez, Eduard</dc:creator>
   <dc:creator>Casas Pla, Josep Ramon</dc:creator>
   <dc:creator>Royo Royo, Santiago</dc:creator>
   <dc:subject>Àrees temàtiques de la UPC::Enginyeria de la telecomunicació::Radiocomunicació i exploració electromagnètica</dc:subject>
   <dc:subject>Adverse weather</dc:subject>
   <dc:subject>Autonomous driving</dc:subject>
   <dc:subject>Calibration</dc:subject>
   <dc:subject>Cameras</dc:subject>
   <dc:subject>Imaging systems</dc:subject>
   <dc:subject>LIDAR</dc:subject>
   <dc:subject>Prototyping</dc:subject>
   <dcterms:abstract>Current autonomous driving datasets face significant limitations in the diversity of adverse weather conditions and in sensor generalization. Additionally, commonly used sensors such as visible cameras or rotating LiDARs struggle to perform under harsh weather conditions. To address these challenges, this work introduces a multimodal data acquisition system that integrates a high-resolution solid-state LiDAR, automotive RADARs, a combination of visible, thermal, SWIR, and polarimetric cameras, and a GNSS/INS system for odometry and localization. This diverse sensor suite ensures robust performance in low-visibility environments, such as fog or heavy rain, by providing complementary and redundant information. Controlled by an autonomous-safe Nvidia DRIVE AGX with a ROS-based architecture, the system enables precise spatial calibration, temporal synchronization, and real-time data fusion and perception algorithms across an overlapped field of view of 60° × 20°. With this system, this work aims to publish in the coming months an open-source, labeled multimodal dataset for autonomous driving, complemented by synthetic data generated through a digital twin in Nvidia’s Omniverse platform. The dataset will have a structure similar to that of the well-known nuScenes and will provide a comparable developer kit for navigating it with ease. The dataset will support a wide range of autonomous driving applications, such as 3D multimodal object detection and tracking, SLAM, depth completion, and perception enhancement in challenging scenarios.</dcterms:abstract>
   <dcterms:abstract>This work has been sponsored by the Spanish Ministry of Science through projects TED2021-132338B-I00 and PID2023-152427OB-I00 and by the European Commission through project GA-101139941. This work is part of a Ph.D. thesis with grant number 2023 FI-1 00229 co-funded by the European Union.</dcterms:abstract>
   <dcterms:abstract>Postprint (author's final draft)</dcterms:abstract>
   <dcterms:issued>2025</dcterms:issued>
   <dc:type>Conference lecture</dc:type>
   <dc:relation>https://www.spiedigitallibrary.org/conference-proceedings-of-spie/13567/135672E/Multimodal-sensing-prototype-for-robust-autonomous-driving-under-adverse-weather/10.1117/12.3062388.short</dc:relation>
   <dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2023-152427OB-I00/ES/SENSOR MULTIMODAL BASADO EN LIDAR PARA SENSADO AMBIENTAL EN MEDIOS SUBMARINOS/</dc:relation>
   <dc:rights>Restricted access - publisher's policy</dc:rights>
   <dc:publisher>International Society for Photo-Optical Instrumentation Engineers (SPIE)</dc:publisher>
</qdc:qualifieddc></metadata></record></GetRecord></OAI-PMH>