
Abstract

This research explores methods for accelerating graph algorithms using theoretical insights into tensors, novel computer architecture, and efficient probabilistic sampling. We first extend a well-known matrix theorem about Kronecker products to tensor Kronecker products. Our new theorem shows that the dominant tensor eigenvector of a Kronecker product of tensors decouples into an outer product of two vectors. When this tensor eigenvector reflects a measure of similarity between two graphs, the decoupling yields a new network alignment heuristic that increases both the number of edges and triangles aligned while running roughly three orders of magnitude faster. In a collaborative effort, we contribute to the evaluation of a fine-grain parallel computer architecture designed for irregular graph workloads. Its key features are massive memory bandwidth and threads that may live for as few as ten cycles yet still deliver efficient overall performance. Our primary role in the collaboration is to project machine and network performance from small-scale studies. We find that the proposed architecture is often 10-100 times faster than existing machines at the same power level, and sometimes up to 1000 times faster. Finally, we improve influence maximization algorithms by accelerating a core primitive, the probabilistic breadth-first search. This routine is fundamental to many influence maximization strategies, as it simulates a single influence propagation scenario or its reverse. A novel use of alias tables and Poisson binomial sampling lets us sample weighted neighbor lists efficiently. In our largest synthetic experiments, this approach yields at least a 10-fold improvement over coin flipping for arrays whose expected number of heads is sufficiently low. When implemented in a full influence maximization algorithm, Poisson binomial sampling improves the end-to-end runtime by a factor of six for commonly tested parameters.
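For context, the matrix theorem being extended here is the standard fact that Kronecker products preserve eigenpairs; a schematic statement in LaTeX (notation ours, not taken from the dissertation):

If $A u = \lambda u$ and $B v = \mu v$, then
$(A \otimes B)(u \otimes v) = (A u) \otimes (B v) = \lambda \mu \, (u \otimes v)$,
so eigenvectors of a Kronecker product factor into Kronecker products of the factors' eigenvectors. The dissertation's result is the tensor analogue: under suitable conditions, the dominant eigenvector of a tensor Kronecker product $\mathcal{A} \otimes \mathcal{B}$ decouples as $u \otimes v$, where $u$ and $v$ are eigenvectors of the factor tensors.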
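To illustrate the sampling primitive, below is a minimal Python sketch of alias-table construction (Vose's method), which supports O(1) draws from a weighted neighbor list after O(n) setup. The function names and example weights are illustrative only; the dissertation's actual implementation, including the Poisson binomial step that first decides how many neighbors activate, is not reproduced here.

import random

def build_alias_table(weights):
    # Vose's alias method: O(n) setup, O(1) per sample.
    n = len(weights)
    total = float(sum(weights))
    prob = [w * n / total for w in weights]   # rescaled so the average entry is 1
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                  # leftover mass in column s is filled from l
        prob[l] -= 1.0 - prob[s]      # l donates exactly enough to top up column s
        (small if prob[l] < 1.0 else large).append(l)
    for i in large + small:           # remaining columns are full up to rounding error
        prob[i] = 1.0
    return prob, alias

def sample_neighbor(prob, alias):
    # Pick a column uniformly at random, then keep it or jump to its alias.
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]

# Usage sketch: draw indices in proportion to hypothetical edge weights.
weights = [0.5, 0.1, 0.1, 0.3]
prob, alias = build_alias_table(weights)
counts = [0] * len(weights)
for _ in range(100000):
    counts[sample_neighbor(prob, alias)] += 1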

Details

Title
Scaling Network Analysis With Tensors, Sampling, and Architecture
Number of pages
150
Publication year
2025
Degree date
2025
School code
0183
Source
DAI-B 87/2(E), Dissertation Abstracts International
ISBN
9798291537183
Committee member
Grama, Ananth Y.; Drineas, Petros; Quanrud, Kent R.
University/institution
Purdue University
University location
United States -- Indiana
Degree
Ph.D.
Source type
Dissertation or Thesis
Language
English
Document type
Dissertation/Thesis
Dissertation/thesis number
32264706
ProQuest document ID
3260932552
Document URL
https://www.proquest.com/dissertations-theses/scaling-network-analysis-with-tensors-sampling/docview/3260932552/se-2?accountid=208611
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database
ProQuest One Academic