Abstract

Machine learning influences numerous aspects of modern society, powers new technologies from AlphaGo to ChatGPT, and increasingly materializes in consumer products such as smartphones and self-driving cars. Despite the vital role and broad applications of artificial neural networks, we lack systematic approaches, such as network science, to understand their underlying mechanisms. The difficulty is rooted in the many possible model configurations, each with different hyper-parameters and weighted architectures determined by noisy data. We bridge the gap by developing a mathematical framework that maps a neural network’s performance to the network properties of the line graph governed by the edge dynamics of stochastic gradient descent differential equations. This framework enables us to derive a neural capacitance metric that universally captures a model’s generalization capability on a downstream task and predicts model performance using only early training results. Numerical results on 17 pre-trained ImageNet models across five benchmark datasets and one NAS benchmark indicate that our neural capacitance metric is a powerful indicator for model selection based only on early training results and is more efficient than state-of-the-art methods.
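The general idea of "predict final performance from early training results" can be made concrete with a much simpler baseline than the paper's neural capacitance metric: extrapolating an early validation-accuracy curve with a power-law fit and ranking candidate models by the extrapolated value. This sketch is an illustration only; the power-law form, the `extrapolate_accuracy` helper, and the assumption that error decays as `b * t^(-c)` are not taken from the paper.

```python
# Illustrative sketch (NOT the paper's neural capacitance method):
# fit the early-epoch error curve err(t) ~ b * t^(-c) in log-log space,
# then extrapolate validation accuracy to a later training horizon and
# use the extrapolated value to rank candidate pre-trained models.
import numpy as np

def extrapolate_accuracy(epochs, accs, horizon=100):
    """Extrapolate validation accuracy to `horizon` epochs.

    Assumes error = 1 - accuracy follows a power law b * t^(-c),
    which is linear in log-log space, so an ordinary least-squares
    fit via np.polyfit recovers (c, log b).
    """
    epochs = np.asarray(epochs, dtype=float)
    err = 1.0 - np.asarray(accs, dtype=float)
    # slope is -c, intercept is log(b) in: log(err) = log(b) - c*log(t)
    slope, intercept = np.polyfit(np.log(epochs), np.log(err), 1)
    return 1.0 - np.exp(intercept) * horizon**slope

def rank_models(early_curves, horizon=100):
    """Rank models (name -> (epochs, accs)) by extrapolated accuracy."""
    scores = {name: extrapolate_accuracy(ep, acc, horizon)
              for name, (ep, acc) in early_curves.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Because the fit is linear in log space, a curve that exactly follows the assumed power law is recovered without error; on real, noisy learning curves the extrapolation is only a rough proxy, which is precisely the gap that more principled early-stopping predictors aim to close.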

Understanding how artificial neural networks function, and how they effectively solve specific tasks, still requires a more rigorous analytical foundation. Using tools from network science and dynamical systems, the authors develop a framework for predicting the performance of artificial neural networks.

Details

Title
Network properties determine neural network performance
Author
Jiang, Chunheng 1; Huang, Zhenhan 1; Pedapati, Tejaswini 2; Chen, Pin-Yu 2; Sun, Yizhou 3; Gao, Jianxi 1

1 Rensselaer Polytechnic Institute, Network Science and Technology Center, Troy, USA (GRID:grid.33647.35) (ISNI:0000 0001 2160 9198); Rensselaer Polytechnic Institute, Department of Computer Science, Troy, USA (GRID:grid.33647.35) (ISNI:0000 0001 2160 9198)
2 IBM Thomas J. Watson Research Center, Yorktown Heights, USA (GRID:grid.481554.9) (ISNI:0000 0001 2111 841X)
3 University of California, Department of Computer Science, Los Angeles, USA (GRID:grid.19006.3e) (ISNI:0000 0000 9632 6718)
Pages
5718
Publication year
2024
Publication date
2024
Publisher
Nature Publishing Group
e-ISSN
2041-1723
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3076832703
Copyright
© The Author(s) 2024. Corrected publication 2024. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.