<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-17T03:23:39Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:2117/424843" metadataPrefix="didl">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:2117/424843</identifier><datestamp>2025-07-17T02:02:46Z</datestamp><setSpec>com_2072_1033</setSpec><setSpec>col_2072_452950</setSpec></header><metadata><d:DIDL xmlns:d="urn:mpeg:mpeg21:2002:02-DIDL-NS" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="urn:mpeg:mpeg21:2002:02-DIDL-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/did/didl.xsd">
   <d:Item id="hdl_2117_424843">
      <d:Descriptor>
         <d:Statement mimeType="application/xml; charset=utf-8">
            <dii:Identifier xmlns:dii="urn:mpeg:mpeg21:2002:01-DII-NS" xsi:schemaLocation="urn:mpeg:mpeg21:2002:01-DII-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/dii/dii.xsd">urn:hdl:2117/424843</dii:Identifier>
         </d:Statement>
      </d:Descriptor>
      <d:Descriptor>
         <d:Statement mimeType="application/xml; charset=utf-8">
            <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
               <dc:title>Inside-out states of garments</dc:title>
               <dc:title>Technical report IRI-TR-24-01</dc:title>
               <dc:creator>Jiménez Schlegl, Pablo</dc:creator>
               <dc:subject>Àrees temàtiques de la UPC::Informàtica::Robòtica</dc:subject>
               <dc:subject>Deformable object manipulation</dc:subject>
               <dc:subject>Robotic garment manipulation</dc:subject>
               <dc:subject>Inside-out configurations</dc:subject>
               <dc:subject>Cloth state representation</dc:subject>
               <dc:subject>Classificació INSPEC::Automation::Robots</dc:subject>
                <dc:description>Cloth items such as garments can potentially adopt infinitely many different configurations, due to their deformability. For the practical purposes of manipulation, however, it is possible to discretize the space of deformations into a set of equivalent states. This is the approach pursued in the present work, which addresses the complex problem of recovering the canonical configuration of partially (or completely) reversed garments. Without claiming to solve the hard perceptual and manipulative challenges of this type of task (which are also thoroughly described), this work should rather be viewed as a pioneering effort to formalize the high-level strategies aimed at solving this canonical configuration-recovery task.</dc:description>
                <dc:description>This work has been partially funded by the European Union Horizon 2020 Programme under grant agreement no. 741930 (CLOTHILDE), by Project ROB-IN PLEC2021-007859 funded by MCIN/AEI/10.13039/501100011033 and by the "European Union NextGenerationEU/PRTR", and by Project CHLOE-Graph PID2020-118649RB-I00 funded by MCIN/AEI/10.13039/501100011033.</dc:description>
               <dc:description>Preprint</dc:description>
               <dc:date>2024-02-28</dc:date>
               <dc:type>External research report</dc:type>
               <dc:relation>IRI-TR-24-01</dc:relation>
               <dc:relation>info:eu-repo/grantAgreement/EC/H2020/741930/EU/CLOTH manIpulation Learning from DEmonstrations/CLOTHILDE</dc:relation>
               <dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PLEC2021-007859/ES/ROB-IN: Robots para la asistencia continua y personalizada capaces de explicar-se a si mismos/</dc:relation>
               <dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2020-118649RB-I00/ES/AGARRE, REPRESENTACION Y PLANIFICACION DE ACCIONES CON OBJETOS TIPO TELA/</dc:relation>
               <dc:rights>http://creativecommons.org/licenses/by-nc-nd/4.0/</dc:rights>
               <dc:rights>Open Access</dc:rights>
               <dc:rights>Attribution-NonCommercial-NoDerivatives 4.0 International</dc:rights>
            </oai_dc:dc>
         </d:Statement>
      </d:Descriptor>
   </d:Item>
</d:DIDL></metadata></record></GetRecord></OAI-PMH>