Abstract

This thesis introduces a novel class of physics-inspired neural networks—Physics-based Neural Deformable Models (PNDMs)—that integrate traditional physics-based deformable models with modern deep learning to achieve interpretable and flexible 3D shape representations. While classical deformable models offer semantic clarity through parametric primitives, they suffer from limited geometric flexibility and dependence on handcrafted initializations. In contrast, PNDMs overcome these limitations by learning parameter functions that generalize primitive geometry, employing diffeomorphic mappings to preserve topology, and leveraging external forces for robust training. We extend this paradigm in DeFormer, a transformer-based framework that hierarchically disentangles global and local shape deformations, and in LEPARD, which enables 3D articulated part discovery directly from 2D supervision. Finally, we demonstrate the application of our methods to photorealistic avatar reconstruction, including the LUCAS system for layered codec avatars. Together, these contributions bridge interpretable physics-based modeling with scalable neural architectures for shape abstraction, segmentation, and generation.

Details

Title
Physics-Based Neural Deformable Models
Author
Liu, Di
Publication year
2025
Publisher
ProQuest Dissertations & Theses
ISBN
9798293845903
Source type
Dissertation or Thesis
Language of publication
English
ProQuest document ID
3251632225
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.