Neural Stochastic Differential Equations for conditional time series generation using the signature Wasserstein-1 metric

Publication date

2023-08-10

Abstract

(Conditional) generative adversarial networks (GANs) have had great success in recent years, owing to their ability to approximate (conditional) distributions over extremely high-dimensional spaces. However, they are highly unstable and computationally expensive to train, especially in the time series setting. Recently, the use of a key object from rough path theory, the signature of a path, has been proposed to address this: it converts the min–max formulation of the (conditional) GAN framework into a classical minimization problem. However, this method is extremely costly in terms of memory, which can sometimes become prohibitive. To overcome this, we propose the use of conditional neural stochastic differential equations, designed to have a memory cost that is constant as a function of depth, making them more memory efficient than traditional deep learning architectures. We empirically test the efficiency of our proposed model against other classical approaches, in terms of both memory cost and computational time, and show that it usually outperforms them according to several metrics.
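As a concrete illustration of the signature mentioned in the abstract, the sketch below computes the depth-2 signature of a piecewise-linear path by iterated sums. This is only a minimal NumPy illustration of the mathematical object, not code from the paper; the function name and setup are our own.

```python
import numpy as np

def signature_depth2(path):
    """Depth-2 signature of a piecewise-linear path of shape (T, d).

    Level 1 is the total increment X_T - X_0.
    Level 2 is the iterated integral  int_{s<t} dX_s (x) dX_t,
    which for a piecewise-linear path splits into cross terms
    between segments plus a within-segment term 0.5 * dX (x) dX.
    """
    inc = np.diff(path, axis=0)                           # (T-1, d) segment increments
    level1 = inc.sum(axis=0)
    cum = np.cumsum(inc, axis=0)                          # running increments
    prev = np.vstack([np.zeros_like(inc[0]), cum[:-1]])   # increment accumulated before each segment
    level2 = prev.T @ inc + 0.5 * np.einsum('ti,tj->ij', inc, inc)
    return level1, level2

# An L-shaped path: one step right, then one step up.
p = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
level1, level2 = signature_depth2(p)
```

A useful sanity check is the shuffle identity `level2 + level2.T == np.outer(level1, level1)`, which holds for any path; the antisymmetric part of level 2 is the Lévy area.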
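A neural SDE replaces the drift and diffusion coefficients of a stochastic differential equation with neural networks. The following minimal Euler–Maruyama sketch uses tiny untrained NumPy networks and a diagonal diffusion for simplicity; it illustrates the general idea only and is not the paper's conditional generator.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(d_in, d_hidden, d_out):
    """Random parameters for a tiny two-layer network (illustrative only)."""
    return (rng.normal(0, 0.5, (d_in, d_hidden)), np.zeros(d_hidden),
            rng.normal(0, 0.5, (d_hidden, d_out)), np.zeros(d_out))

def mlp(params, x):
    """Two-layer network: tanh hidden layer, linear output."""
    w1, b1, w2, b2 = params
    return np.tanh(x @ w1 + b1) @ w2 + b2

def simulate_neural_sde(drift, diffusion, x0, n_steps, dt):
    """Euler–Maruyama simulation of dX_t = mu(X_t) dt + sigma(X_t) dW_t,
    with mu and sigma given by neural networks (sigma applied diagonally)."""
    path = [x0]
    x = x0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=x.shape)  # Brownian increment
        x = x + mlp(drift, x) * dt + mlp(diffusion, x) * dw
        path.append(x)
    return np.stack(path)

d = 2  # state dimension (hypothetical)
drift = init_mlp(d, 16, d)
diffusion = init_mlp(d, 16, d)
path = simulate_neural_sde(drift, diffusion, np.zeros(d), n_steps=100, dt=0.01)
print(path.shape)  # (101, 2)
```

The memory advantage claimed in the abstract comes from not storing intermediate activations of this loop during training (e.g. via adjoint-style backpropagation), so memory stays constant in the number of solver steps.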

Document type

Article


Published version

Language

English

Published by

Infopro Digital

Related documents

Reproduction of the document published at: https://doi.org/10.21314/JCF.2023.005

Journal of Computational Finance, 2023, vol. 27, no. 1, pp. 1-23


Rights

(c) Infopro Digital, 2023
