Abstract

Disentanglement is a highly desirable property of representations owing to its similarity to human understanding and reasoning. Many works build disentanglement on information bottlenecks (IBs). Despite their elegant mathematical foundations, IB-based approaches usually exhibit lower performance. To provide insight into this problem, we develop an annealing test that calculates the information freezing point (IFP), the transition point at which information freezes into the latent variables. We also exploit this inductive bias to separate entangled factors according to differences in their IFP distributions. We find that existing approaches suffer from an information diffusion problem, in which newly added information diffuses across all latent variables. Based on this insight, we propose a novel disentanglement framework, termed distilling entangled factors (DEFT), which addresses the information diffusion problem by scaling backward information. DEFT applies a multistage training strategy, including multigroup encoders with different learning rates and piecewise pressure, to disentangle the factors stage by stage. We evaluate DEFT on three variants of dSprites and on SmallNORB, on which it achieves low-variance, high disentanglement scores. Furthermore, an experiment with correlated factors demonstrates the incapability of TC-based approaches. DEFT also exhibits competitive performance in the unsupervised setting.
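The abstract does not spell out how the IFP is extracted from an annealing sweep, but one plausible reading is: anneal the bottleneck pressure (the β weight on the KL term), record the information each latent variable carries at each pressure, and take the IFP as the last pressure at which the latent still carries information. The sketch below illustrates that reading; the function name, the `(beta, kl)` sweep format, and the threshold are all illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: estimating an information freezing point (IFP) from a
# beta-annealing sweep of one latent variable. Assumptions (not from the
# abstract): the sweep records the per-latent KL (in nats) at each
# pressure beta, betas ascending, and the IFP is the largest beta at
# which KL still exceeds a small threshold.

def information_freezing_point(sweep, threshold=0.01):
    """sweep: list of (beta, kl) pairs with beta ascending.

    Returns the largest beta whose KL exceeds `threshold`, i.e. the
    pressure just before the latent's information freezes out, or None
    if the latent never carries information."""
    ifp = None
    for beta, kl in sweep:
        if kl > threshold:
            ifp = beta  # latent still active at this pressure
    return ifp

# Toy sweep: this latent's information freezes out between beta=4 and 8.
sweep = [(1, 2.30), (2, 1.10), (4, 0.35), (8, 0.004), (16, 0.0)]
print(information_freezing_point(sweep))  # -> 4
```

Under this reading, factors whose latents freeze at clearly different pressures can be told apart by their IFP distributions, which is the inductive bias the abstract describes.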

Details

Title
DEFT: distilling entangled factors by preventing information diffusion
Author
Jiantao 1 ; Wang, Lin 1 ; Yang, Bo 1 ; Li, Fanqi 2 ; Liu, Chunxiuzi 2 ; Zhou, Jin 2 

 University of Jinan, Shandong Provincial Key Laboratory of Network Based Intelligent Computing, Jinan, China (GRID:grid.454761.5) (ISNI:0000 0004 1759 9355); Quancheng Shandong Laboratory, Jinan, China (GRID:grid.454761.5) 
 University of Jinan, Shandong Provincial Key Laboratory of Network Based Intelligent Computing, Jinan, China (GRID:grid.454761.5) (ISNI:0000 0004 1759 9355) 
Pages
2275-2295
Publication year
2022
Publication date
Jun 2022
Publisher
Springer Nature B.V.
ISSN
0885-6125
e-ISSN
1573-0565
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2671804777
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media LLC, part of Springer Nature 2022.