<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-17T03:30:56Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:2117/449440" metadataPrefix="qdc">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:2117/449440</identifier><datestamp>2026-01-16T04:10:54Z</datestamp><setSpec>com_2072_1033</setSpec><setSpec>col_2072_452950</setSpec></header><metadata><qdc:qualifieddc xmlns:qdc="http://dspace.org/qualifieddc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://purl.org/dc/elements/1.1/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dc.xsd http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dcterms.xsd http://dspace.org/qualifieddc/ http://www.ukoln.ac.uk/metadata/dcmi/xmlschema/qualifieddc.xsd">
   <dc:title>A Single-neuron-per-class Readout for image-encoded sensor time series</dc:title>
   <dc:creator>Bernal Casas, David</dc:creator>
   <dc:creator>Gallego Vila, Jaime</dc:creator>
   <dc:subject>Àrees temàtiques de la UPC::Informàtica</dc:subject>
   <dc:subject>End-to-end learning</dc:subject>
   <dc:subject>Single-neuron-per-class readout</dc:subject>
   <dc:subject>Neuromorphic computing</dc:subject>
   <dc:subject>Image-encoded time series</dc:subject>
   <dc:subject>Neural networks</dc:subject>
   <dc:subject>Spiking neural networks</dc:subject>
   <dc:subject>Resonate-and-fire (RAF) neuron</dc:subject>
   <dc:subject>Noisy environments</dc:subject>
   <dcterms:abstract>We introduce an ultra-compact, single-neuron-per-class end-to-end readout for binary classification of noisy, image-encoded sensor time series. The approach compares a linear single-unit perceptron (E2E-MLP-1) with a resonate-and-fire (RAF) neuron (E2E-RAF-1), which merges feature selection and decision-making in a single block. Beyond empirical evaluation, we provide a mathematical analysis of the RAF readout: starting from its subthreshold ordinary differential equation, we derive the transfer function H(jω), characterize the frequency response, and relate the output signal-to-noise ratio (SNR) to |H(jω)|² and the noise power spectral density Sn(ω) ∝ ω^α (brown, pink, and blue noise). We present a stable discrete-time implementation compatible with surrogate gradient training and discuss the associated stability constraints. As a case study, we classify walk-in-place (WIP) in a virtual reality (VR) environment, using a vision-based motion encoding (72 × 56 grayscale) derived from 3D trajectories and comprising 44,084 samples from 15 participants. On clean data, both single-neuron-per-class models approach ceiling accuracy, while under colored noise the RAF readout yields consistent gains (typically +5–8% absolute accuracy at medium/high perturbations), indicative of intrinsic band-selective filtering induced by resonance. With ~8k parameters and sub-2 ms inference on commodity graphics processing units (GPUs), the RAF readout provides a mathematically grounded, robust, and efficient alternative for stochastic signal processing across domains, with virtual reality locomotion used here as an illustrative validation.</dcterms:abstract>
   <dcterms:abstract>Peer Reviewed</dcterms:abstract>
   <dcterms:abstract>Postprint (published version)</dcterms:abstract>
   <dcterms:issued>2025-12-05</dcterms:issued>
   <dc:type>Article</dc:type>
   <dc:relation>https://www.mdpi.com/2227-7390/13/24/3893</dc:relation>
   <dc:rights>http://creativecommons.org/licenses/by/4.0/</dc:rights>
   <dc:rights>Open Access</dc:rights>
   <dc:rights>Attribution 4.0 International</dc:rights>
   <dc:publisher>Multidisciplinary Digital Publishing Institute (MDPI)</dc:publisher>
</qdc:qualifieddc></metadata></record></GetRecord></OAI-PMH>