Abstract

The nested distance builds on the Wasserstein distance to quantify the difference between stochastic processes, including the evolution of information modelled by filtrations. The Sinkhorn divergence is a relaxation of the Wasserstein distance, which can be computed considerably faster. For this reason we employ the Sinkhorn divergence and take advantage of the related (fixed point) iteration algorithm. Furthermore, we investigate the transition of the entropy throughout the stages of the stochastic process and provide an entropy-regularized nested distance formulation, including a characterization of its dual. Numerical experiments affirm the computational advantage and the superiority of the approach.
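The Sinkhorn fixed-point iteration referred to in the abstract can be illustrated on a single-stage (non-nested) problem. The sketch below is a generic illustration of entropy-regularized optimal transport, not the paper's nested algorithm; the regularization strength `eps` and iteration count are hypothetical choices for the toy example.

```python
import numpy as np

def sinkhorn(mu, nu, cost, eps=0.1, n_iter=500):
    """Entropy-regularized transport cost via Sinkhorn fixed-point iteration.

    mu, nu : marginal probability vectors
    cost   : cost matrix between the support points of mu and nu
    eps    : entropy-regularization strength (smaller -> closer to Wasserstein)
    """
    K = np.exp(-cost / eps)           # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)            # scale to match column marginal
        u = mu / (K @ v)              # scale to match row marginal
    pi = u[:, None] * K * v[None, :]  # resulting transport plan
    return float(np.sum(pi * cost))   # regularized transport cost

# Toy example: identical two-point marginals, cost |x - y|
x = np.array([0.0, 1.0])
cost = np.abs(x[:, None] - x[None, :])
mu = nu = np.array([0.5, 0.5])
print(sinkhorn(mu, nu, cost, eps=0.01))  # close to 0: no mass needs to move
```

For identical marginals the unregularized optimal cost is zero, and with small `eps` the Sinkhorn value is close to it; the nested distance applies such a coupling stage-wise along the filtration.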

Details

Title
The nested Sinkhorn divergence to learn the nested distance
Author
Pichler, Alois 1; Weinhardt, Michael 1

1 Chemnitz University of Technology, Faculty of Mathematics, Chemnitz, Germany
Pages
269-293
Publication year
2022
Publication date
Jun 2022
Publisher
Springer Nature B.V.
ISSN
1619-697X
e-ISSN
1619-6988
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2666071794
Copyright
© The Author(s) 2021. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.