
Abstract

Performance modeling is an integral part of the research process for computational scientists: it enables them to understand how different factors contribute to the final runtime of an application. This understanding is crucial for developing efficient scientific applications and simulations. Performance modeling is difficult, however, because many factors may contribute to final performance. The algorithm, problem size, implementation, architecture, and systems software stack all affect performance, often in complex and interdependent ways. Analytical models can be used to study the relationship between these causal variables and performance, but they are difficult to scale to large numbers of input variables. Moreover, the relationship between the causal variables and performance may be unknown or complex, making an analytical model hard to derive. Machine learning (ML) can help address these challenges, as ML algorithms excel at modeling unknown and complex relationships. ML-based performance models can also handle large numbers of input variables, making them well suited to modeling complex scientific codes. By training ML models on historical performance data, computational scientists can develop accurate models that predict the performance of new applications and simulations under different scenarios. However, current ML-based modeling approaches are limited to one or two sources of performance data, such as hardware counters or application features, which prevents models from exploiting all available causal variables that may impact performance. This thesis introduces novel approaches to performance modeling that can make use of all available data sources. It also introduces performance latent spaces that model various output metrics, such as runtime or energy consumption, in a unified manner.
Finally, it introduces a method for integrating these performance models into large language models, enabling the modeling and improvement of code performance.
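The abstract's core premise, fitting a model to historical performance data and using it to predict the runtime of unseen configurations, can be shown with a minimal sketch. The run data, the single derived feature (work per process), and the least-squares fit below are hypothetical stand-ins for illustration, not the models developed in the dissertation:

```python
# Hypothetical historical runs: (problem_size, process_count, measured runtime in s).
# All numbers are made up for illustration.
runs = [
    (1_000_000, 8, 2.1),
    (1_000_000, 16, 1.2),
    (4_000_000, 8, 8.3),
    (4_000_000, 16, 4.4),
    (8_000_000, 32, 4.5),
    (8_000_000, 64, 2.6),
]

# Derive one causal feature from the raw variables: work per process.
xs = [size / procs for size, procs, _ in runs]
ys = [t for _, _, t in runs]

# Ordinary least squares for runtime ~= a * work_per_process + b.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

def predict_runtime(size: float, procs: int) -> float:
    """Predict runtime (seconds) for an unseen configuration."""
    return a * (size / procs) + b
```

A real performance model would replace the hand-picked feature and linear fit with a learned model over many input variables; the thesis goes further by fusing multiple data modalities (e.g., hardware counters and application features) rather than a single derived feature.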

Details

Business indexing term
1010268
Title
On Learning Behaviors of Parallel Code and Systems Across Modalities
Author
Number of pages
245
Publication year
2025
Degree date
2025
School code
0117
Source
DAI-B 86/12(E), Dissertation Abstracts International
ISBN
9798286444472
Committee member
Sussman, Alan; Daumé, Hal; Larsson, Johan; Schulz, Martin; Gamblin, Todd
University/institution
University of Maryland, College Park
Department
Computer Science
University location
United States -- Maryland
Degree
Ph.D.
Source type
Dissertation or Thesis
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
31939224
ProQuest document ID
3225337893
Document URL
https://www.proquest.com/dissertations-theses/on-learning-behaviors-parallel-code-systems/docview/3225337893/se-2?accountid=208611
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database
ProQuest One Academic