HMM for short independent sequences: Multiple sequence Baum-Welch application

Other authors

Universitat Politècnica de Catalunya. Departament de Teoria del Senyal i Comunicacions

Universitat Politècnica de Catalunya. SPCOM - Processament del Senyal i Comunicacions

Date of publication

2025-10-30



Abstract

Scientific document offering guidance on programming Hidden Markov Model procedures for large-scale datasets of short sequences.


In the classical setting, training a Hidden Markov Model (HMM) typically relies on a single, sufficiently long observation sequence that can be regarded as representative of the underlying stochastic process. In this context, the Expectation-Maximization (EM) algorithm is applied in its specialized form for HMMs, namely the Baum-Welch algorithm, which has been extensively employed in applications such as speech recognition. The objective of this work is to present pseudocode formulations for both the training and decoding procedures of HMMs in a different scenario, where the available data consist of multiple independent temporal sequences generated by the same model, each of relatively short duration, i.e., containing only a limited number of samples. Special emphasis is placed on the relevance of this formulation to longitudinal studies in population health, where datasets are naturally structured as collections of short trajectories across individuals, with point measurements at each follow-up.
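The report itself is not reproduced here, so the following is only a minimal sketch of the scenario the abstract describes: a discrete-emission HMM trained with Baum-Welch over many short, independent sequences. All function and variable names are illustrative assumptions; the key point is that the forward-backward recursions run on each sequence separately, and the expected counts are pooled across sequences before the M-step re-estimation.

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """Scaled forward-backward pass for one short observation sequence."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta            # posterior state probabilities
    loglik = np.log(scale).sum()    # log P(obs | pi, A, B)
    return alpha, beta, gamma, scale, loglik

def baum_welch_multi(seqs, n_states, n_symbols, n_iter=30, seed=0):
    """Baum-Welch that pools expected counts over many independent sequences."""
    rng = np.random.default_rng(seed)
    pi = rng.dirichlet(np.ones(n_states))
    A = rng.dirichlet(np.ones(n_states), size=n_states)
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)
    history = []
    for _ in range(n_iter):
        pi_acc = np.zeros(n_states)
        A_num = np.zeros((n_states, n_states))
        A_den = np.zeros(n_states)
        B_num = np.zeros((n_states, n_symbols))
        total_ll = 0.0
        for obs in seqs:  # E-step: accumulate counts across sequences
            alpha, beta, gamma, scale, ll = forward_backward(obs, pi, A, B)
            total_ll += ll
            pi_acc += gamma[0]                # one initial-state count per sequence
            for t in range(len(obs) - 1):     # expected transition counts
                A_num += alpha[t][:, None] * A * B[:, obs[t + 1]] * beta[t + 1] / scale[t + 1]
            A_den += gamma[:-1].sum(axis=0)
            for t, o in enumerate(obs):       # expected emission counts
                B_num[:, o] += gamma[t]
        history.append(total_ll)
        pi = pi_acc / pi_acc.sum()            # M-step: normalize pooled counts
        A = A_num / A_den[:, None]
        B = B_num / B_num.sum(axis=1, keepdims=True)
    return pi, A, B, history
```

A notable difference from single-sequence training is that the initial distribution `pi` is re-estimated from one posterior per sequence; with many independent short trajectories, as in longitudinal cohorts, this quantity becomes estimable rather than being fixed from a single observed start.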


Preprint

Document type

External research report

Language

English

Rights

http://creativecommons.org/licenses/by-nc-sa/4.0/

Open Access
