<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-13T02:58:26Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:2117/439139" metadataPrefix="oai_dc">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:2117/439139</identifier><datestamp>2026-02-09T04:56:19Z</datestamp><setSpec>com_2072_1033</setSpec><setSpec>col_2072_452950</setSpec></header><metadata><oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
   <dc:title>Deep learning for automated fish detection in underwater images: a tool for sustainable marine ecosystem monitoring</dc:title>
   <dc:creator>Prat Bayarri, Oriol</dc:creator>
   <dc:creator>Baños Castelló, Pol</dc:creator>
   <dc:creator>Martínez Padró, Enoc</dc:creator>
   <dc:creator>Francescangeli, Marco</dc:creator>
   <dc:creator>Toma, Daniel</dc:creator>
   <dc:creator>Carandell Widmer, Matias</dc:creator>
   <dc:creator>Prat Farran, Joana d'Arc</dc:creator>
   <dc:creator>Río Fernández, Joaquín del</dc:creator>
   <dc:contributor>Universitat Politècnica de Catalunya. Centre de Desenvolupament Tecnològic de Sistemes d'Adquisició Remota i Tractament de la Informació</dc:contributor>
   <dc:contributor>Universitat Politècnica de Catalunya. Departament d'Enginyeria Electrònica</dc:contributor>
   <dc:contributor>Universitat Politècnica de Catalunya. Departament d'Enginyeria Elèctrica</dc:contributor>
   <dc:contributor>Universitat Politècnica de Catalunya. Departament de Matemàtiques</dc:contributor>
   <dc:contributor>Universitat Politècnica de Catalunya. SARTI-MAR - Sistemes d'Adquisició Remota de dades i Tractament de la Informació en el Medi Marí</dc:contributor>
   <dc:subject>Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial</dc:subject>
   <dc:subject>Àrees temàtiques de la UPC::Enginyeria electrònica::Instrumentació i mesura</dc:subject>
   <dc:subject>Deep learning</dc:subject>
   <dc:subject>Fish detection</dc:subject>
   <dc:subject>YOLO</dc:subject>
   <dc:subject>Underwater imagery</dc:subject>
   <dc:subject>AI-assisted labeling</dc:subject>
   <dc:subject>Marine ecosystem monitoring</dc:subject>
   <dc:subject>Convolutional neural networks</dc:subject>
   <dc:subject>Object detection</dc:subject>
   <dc:subject>Machine learning</dc:subject>
   <dc:subject>Ecological data analysis</dc:subject>
   <dc:subject>Marine species classification</dc:subject>
   <dc:subject>Artificial intelligence</dc:subject>
   <dc:description>Deep learning has emerged as a powerful tool for automated object detection, offering unprecedented speed and accuracy in analyzing complex visual data. In the context of marine ecosystem monitoring, convolutional neural networks (CNNs), particularly YOLO-based architectures, have demonstrated remarkable efficiency in detecting and classifying fish species in underwater imagery. Traditional fish identification methods rely on manual annotation, which is both time-consuming and prone to inconsistencies. By implementing a semi-automated labeling approach, where human experts refine AI-generated predictions, the annotation process can be streamlined while ensuring taxonomic precision. A key aspect of this research is the creation of a comprehensive training guide that optimizes the model’s performance by detailing best practices in dataset preparation, annotation techniques, hyperparameter tuning, and augmentation strategies. Using a dataset derived from the OBSEA marine observatory, results indicate that the YOLO extra-large model, trained with a small learning rate and high-resolution images, achieves optimal performance in fish identification. The findings underscore the potential of AI-assisted methodologies in ecological research, offering a scalable and efficient alternative to manual annotation for sustainable marine biodiversity monitoring.</dc:description>
   <dc:description>This work has been supported by various funding sources and research initiatives. We acknowledge the financial support from grants 2023 INV-2 00044 (position codes 200044TC31 and 200044TC6). Additionally, this research has been funded by the European Commission’s HORIZON-INFRA-2021-SERV-01 program under the iMagine project (grant agreement 101058625). We also recognize the use of the EGI infrastructure with dedicated support from EGI-IFCA-STACK, which contributed to the computational resources required for this study. Furthermore, the researchers wish to acknowledge the support of the Associated Unit Tecnoterra, composed of members from UPC and ICM-CSIC, for their valuable collaboration in this work.</dc:description>
   <dc:description>Peer Reviewed</dc:description>
   <dc:description>Postprint (published version)</dc:description>
   <dc:date>2025-07-21</dc:date>
   <dc:type>Part of book or chapter of book</dc:type>
   <dc:identifier>Prat, O. [et al.]. Deep learning for automated fish detection in underwater images: a tool for sustainable marine ecosystem monitoring. A: «The latest advances in the field of intelligent systems». IntechOpen, 2025.</dc:identifier>
   <dc:identifier>https://hdl.handle.net/2117/439139</dc:identifier>
   <dc:identifier>10.5772/intechopen.1011280</dc:identifier>
   <dc:language>eng</dc:language>
   <dc:relation>https://www.intechopen.com/online-first/1218661</dc:relation>
   <dc:relation>info:eu-repo/grantAgreement/EC/HE/101058625/EU/Imaging data and services for aquatic science/iMagine</dc:relation>
   <dc:relation>info:eu-repo/grantAgreement/EC/HE/101094924/EU/operAtional seNsing lifE technologies for maRIne ecosystemS/ANERIS</dc:relation>
   <dc:relation>info:eu-repo/grantAgreement/EC/HE/101112883/EU/Digital Twin-sustained 4D ecological monitoring of restoration in fishery depleted areas/DIGI4ECO</dc:relation>
   <dc:rights>http://creativecommons.org/licenses/by/4.0/</dc:rights>
   <dc:rights>Open Access</dc:rights>
   <dc:rights>Attribution 4.0 International</dc:rights>
   <dc:format>application/pdf</dc:format>
   <dc:publisher>IntechOpen</dc:publisher>
</oai_dc:dc></metadata></record></GetRecord></OAI-PMH>