To access the full-text documents, please follow this link: http://hdl.handle.net/10230/934

Worst-case bounds for the logarithmic loss of predictors
Cesa-Bianchi, Nicolò; Lugosi, Gábor
Universitat Pompeu Fabra. Departament d'Economia i Empresa
We investigate on-line prediction of individual sequences. Given a class of predictors, the goal is to predict as well as the best predictor in the class, where the loss is measured by the self-information (logarithmic) loss function. The excess loss (regret) is closely related to the redundancy of the associated lossless universal code. Using Shtarkov's theorem and tools from empirical process theory, we prove a general upper bound on the best possible (minimax) regret. The bound depends on certain metric properties of the class of predictors. We apply the bound to both parametric and nonparametric classes of predictors. Finally, we point out a suboptimal behavior of the popular Bayesian weighted average algorithm.
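For reference, a standard statement of the minimax regret identity the abstract invokes (the notation below is ours, not taken from the record): by Shtarkov's theorem, for a class $\mathcal{F}$ of predictors over sequences $x^n = (x_1, \dots, x_n) \in \mathcal{X}^n$,

\[
\min_{q} \; \max_{x^n \in \mathcal{X}^n} \, \log \frac{\sup_{p \in \mathcal{F}} p(x^n)}{q(x^n)}
\;=\;
\log \sum_{x^n \in \mathcal{X}^n} \, \sup_{p \in \mathcal{F}} p(x^n),
\]

i.e., the minimax regret equals the log of the Shtarkov sum and is attained by the normalized maximum likelihood distribution. The paper's upper bounds control this quantity through metric (covering) properties of $\mathcal{F}$.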
15-09-2005
Statistics, Econometrics and Quantitative Methods
universal prediction
universal coding
empirical processes
on-line learning
metric entropy
Access to the contents of this document is subject to acceptance of the terms of use established by the following Creative Commons license
http://creativecommons.org/licenses/by-nc-nd/3.0/es/
Working paper