<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="static/style.xsl"?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-18T07:33:47Z</responseDate><request verb="GetRecord" identifier="oai:www.recercat.cat:2117/449440" metadataPrefix="oai_dc">https://recercat.cat/oai/request</request><GetRecord><record><header><identifier>oai:recercat.cat:2117/449440</identifier><datestamp>2026-01-16T04:10:54Z</datestamp><setSpec>com_2072_1033</setSpec><setSpec>col_2072_452950</setSpec></header><metadata><oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
   <dc:title>A Single-neuron-per-class Readout for image-encoded sensor time series</dc:title>
   <dc:creator>Bernal Casas, David</dc:creator>
   <dc:creator>Gallego Vila, Jaime</dc:creator>
   <dc:contributor>Universitat Politècnica de Catalunya. Departament de Ciències de la Computació</dc:contributor>
   <dc:contributor>Universitat Politècnica de Catalunya. ViRVIG - Grup de Recerca en Visualització, Realitat Virtual i Interacció Gràfica</dc:contributor>
   <dc:subject>Àrees temàtiques de la UPC::Informàtica</dc:subject>
   <dc:subject>End-to-end learning</dc:subject>
   <dc:subject>Single-neuron-per-class readout</dc:subject>
   <dc:subject>Neuromorphic computing</dc:subject>
   <dc:subject>Image-encoded time series</dc:subject>
   <dc:subject>Neural networks</dc:subject>
   <dc:subject>Spiking neural networks</dc:subject>
   <dc:subject>Resonate-and-fire (RAF) neuron</dc:subject>
   <dc:subject>Noisy environments</dc:subject>
   <dc:description>We introduce an ultra-compact, single-neuron-per-class end-to-end readout for binary classification of noisy, image-encoded sensor time series. The approach compares a linear single-unit perceptron (E2E-MLP-1) with a resonate-and-fire (RAF) neuron (E2E-RAF-1), which merges feature selection and decision-making in a single block. Beyond empirical evaluation, we provide a mathematical analysis of the RAF readout: starting from its subthreshold ordinary differential equation, we derive the transfer function H(jω), characterize the frequency response, and relate the output signal-to-noise ratio (SNR) to |H(jω)|² and the noise power spectral density Sn(ω) ∝ ω⁻ᵅ (brown, pink, and blue noise). We present a stable discrete-time implementation compatible with surrogate gradient training and discuss the associated stability constraints. As a case study, we classify walk-in-place (WIP) in a virtual reality (VR) environment, a vision-based motion encoding (72 × 56 grayscale) derived from 3D trajectories, comprising 44,084 samples from 15 participants. On clean data, both single-neuron-per-class models approach ceiling accuracy; under colored noise, however, the RAF readout yields consistent gains (typically +5–8% absolute accuracy at medium/high perturbations), indicative of intrinsic band-selective filtering induced by resonance. With ~8k parameters and sub-2 ms inference on commodity graphics processing units (GPUs), the RAF readout provides a mathematically grounded, robust, and efficient alternative for stochastic signal processing across domains, with virtual reality locomotion used here as an illustrative validation.</dc:description>
   <dc:description>Peer Reviewed</dc:description>
   <dc:description>Postprint (published version)</dc:description>
   <dc:date>2025-12-05</dc:date>
   <dc:type>Article</dc:type>
   <dc:identifier>Bernal-Casas, D.; Gallego, J. A Single-neuron-per-class Readout for image-encoded sensor time series. «Mathematics», 5 December 2025, vol. 13, no. 24, article 3893.</dc:identifier>
   <dc:identifier>2227-7390</dc:identifier>
   <dc:identifier>https://hdl.handle.net/2117/449440</dc:identifier>
   <dc:identifier>10.3390/math13243893</dc:identifier>
   <dc:language>eng</dc:language>
   <dc:relation>https://www.mdpi.com/2227-7390/13/24/3893</dc:relation>
   <dc:rights>http://creativecommons.org/licenses/by/4.0/</dc:rights>
   <dc:rights>Open Access</dc:rights>
   <dc:rights>Attribution 4.0 International</dc:rights>
   <dc:format>application/pdf</dc:format>
   <dc:publisher>Multidisciplinary Digital Publishing Institute (MDPI)</dc:publisher>
</oai_dc:dc></metadata></record></GetRecord></OAI-PMH>