Abstract

Probabilistic Latent Semantic Analysis has been related to the Singular Value Decomposition, but several problems arise when this comparison is made. Restrictions on the data class and the existence of several local optima mask the relation, reducing it to a formal analogy without real significance. Moreover, the computational cost in time and memory limits the applicability of the technique. In this work, we use Nonnegative Matrix Factorization with the Kullback-Leibler divergence to prove that, when the number of model components is large enough and a limit condition is reached, the empirical distributions of the Singular Value Decomposition and of Probabilistic Latent Semantic Analysis are arbitrarily close. Under such conditions, equality between Nonnegative Matrix Factorization and Probabilistic Latent Semantic Analysis is obtained. With this result, the Singular Value Decomposition of any matrix with nonnegative entries converges to the results of the general-case Probabilistic Latent Semantic Analysis and constitutes its unique probabilistic image. Moreover, a faster algorithm for Probabilistic Latent Semantic Analysis is provided.
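The NMF-to-PLSA correspondence the abstract relies on can be sketched with the standard Lee-Seung multiplicative updates for the Kullback-Leibler objective: any KL-NMF factorization V ≈ WH can be renormalized into PLSA-style probabilities P(d, w) = Σ_k P(k) P(d|k) P(w|k). This is a minimal illustration on toy data, not the paper's faster algorithm; the matrix sizes, component count, and iteration budget are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonnegative term-document matrix (illustrative data, not from the paper).
V = rng.random((8, 6)) + 0.1

# Number of latent components; the paper's limit argument requires this to be
# large enough. k = 3 is an arbitrary choice for the sketch.
k = 3
W = rng.random((8, k)) + 0.1
H = rng.random((k, 6)) + 0.1

# Lee-Seung multiplicative updates minimizing the Kullback-Leibler
# (I-)divergence between V and the product WH.
for _ in range(500):
    W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)
    H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]

# Renormalize the factors into PLSA-style probabilities:
#   P(d, w) = sum_k P(k) * P(d|k) * P(w|k)
P_d_given_k = W / W.sum(axis=0)            # each column sums to 1
P_w_given_k = H / H.sum(axis=1)[:, None]   # each row sums to 1
P_k = W.sum(axis=0) * H.sum(axis=1) / (W @ H).sum()

# The joint distribution implied by the factorization; it equals the
# normalized reconstruction WH / sum(WH) by construction.
joint = np.einsum('k,dk,kw->dw', P_k, P_d_given_k, P_w_given_k)
```

The renormalization step is exact: the joint built from P(k), P(d|k), P(w|k) coincides with the normalized product WH, which is what lets a KL-NMF solution be read as a PLSA model.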

Details

Title
On the Probabilistic Latent Semantic Analysis Generalization as the Singular Value Decomposition Probabilistic Image
Author
Vinué, Pau Figuera 1; Bringas, Pablo García 1

 University of Deusto Unibertsitate Etorb., Faculty of Engineering, Bizkaia, Spain (GRID:grid.14724.34) (ISNI:0000 0001 0941 7046) 
Pages
286-296
Publication year
2020
Publication date
Jun 2020
Publisher
Springer Nature B.V.
ISSN
1538-7887
e-ISSN
2214-1766
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2911680915
Copyright
© The Authors 2020. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.