Proceedings of the 34th ACM SIGPLAN Conference on Programming Language Design and Implementation 2013
DOI: 10.1145/2491956.2462181

SMAT: An Input Adaptive Auto-Tuner for Sparse Matrix-Vector Multiplication

Abstract: Sparse Matrix Vector multiplication (SpMV) is an important kernel in both traditional high performance computing and emerging data-intensive applications. So far, SpMV libraries have been optimized by either application-specific or architecture-specific approaches, making the libraries too complicated to be used extensively in real applications. In this work we develop a Sparse Matrix-vector multiplication Auto-Tuning system (SMAT) to bridge the gap between specific optimizations and general-purpose usage. S-M…
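For reference, the kernel being tuned is conceptually simple: a plain, untuned SpMV over a matrix stored in CSR (compressed sparse row) form looks roughly like the sketch below. The field names and the function spmv_csr are illustrative only and are not taken from SMAT, which generates and selects among many format- and input-specific variants of this loop automatically.

    /* Minimal, untuned SpMV sketch: y = A*x with A held in CSR form.
     * Illustrative only; not SMAT's generated code. */
    #include <stddef.h>

    void spmv_csr(size_t m,               /* number of rows              */
                  const size_t *row_ptr,  /* length m+1: row offsets     */
                  const size_t *col_idx,  /* length nnz: column indices  */
                  const double *vals,     /* length nnz: nonzero values  */
                  const double *x,        /* dense input vector          */
                  double *y)              /* dense output vector         */
    {
        for (size_t i = 0; i < m; ++i) {
            double sum = 0.0;
            for (size_t k = row_ptr[i]; k < row_ptr[i + 1]; ++k)
                sum += vals[k] * x[col_idx[k]];
            y[i] = sum;
        }
    }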

Cited by 97 publications (11 citation statements)
References 34 publications

“…Furthermore, choosing the optimal format by running the available options first can result in significant overheads. ML and Artificial Intelligence (AI) have been successful in various optimization tasks ranging from code optimization to model selection [18], including the task of selecting the optimal sparse matrix storage format [13], [16]. Adopting an ML model has the potential to offer an accurate and low-overhead solution to the problem of automatic format selection, eliminating any requirement for manual format-selection input from the user and enabling applications to remain optimal across different types of hardware and sparsity patterns for any operation of interest.…”
Section: A. Motivation
confidence: 99%
“…Similarly, Sedaghati et al [26] used a Decision Tree based classifier to choose from five available formats, resulting in 81% accuracy on GPUs. On the other hand, Li et al [13] proposed an input-adaptive SpMV auto-tuner based on rule-set classification that maintains a confidence value for each test sample. If the classifier's confidence exceeds the defined threshold, the auto-tuner selects the predicted format and SpMV kernel; otherwise it switches to a run-first approach to make the decision, reporting accuracy of up to 85%.…”
Section: Related Work
confidence: 99%
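The hybrid strategy quoted above — trust the classifier when its confidence clears a threshold, otherwise fall back to timing the candidates — can be summarized in a short sketch. Every name below (the format list, classify, time_spmv, the stubbed confidence value) is a hypothetical placeholder, not an interface from SMAT or the cited works.

    /* Hypothetical sketch of confidence-thresholded format selection with a
     * run-first fallback. All interfaces are illustrative placeholders. */
    #include <float.h>

    enum format { FMT_CSR, FMT_COO, FMT_ELL, FMT_DIA, FMT_COUNT };

    struct prediction {
        enum format fmt;    /* format suggested by the rule-set classifier */
        double confidence;  /* classifier's confidence in that suggestion  */
    };

    /* Stub: a real system would extract matrix features (nnz per row,
     * diagonal density, ...) and apply the learned rule set here. */
    static struct prediction classify(void)
    {
        struct prediction p = { FMT_CSR, 0.90 };
        return p;
    }

    /* Stub: a real system would run and time the SpMV kernel for fmt. */
    static double time_spmv(enum format fmt)
    {
        (void)fmt;
        return 1.0;
    }

    enum format select_format(double threshold)
    {
        struct prediction p = classify();
        if (p.confidence >= threshold)
            return p.fmt;                    /* low-overhead predicted path */

        /* Run-first fallback: benchmark each format and keep the fastest. */
        enum format best = FMT_CSR;
        double best_time = DBL_MAX;
        for (int f = 0; f < FMT_COUNT; ++f) {
            double t = time_spmv((enum format)f);
            if (t < best_time) { best_time = t; best = (enum format)f; }
        }
        return best;
    }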
“…This phenomenon has been observed from sparse matrices, where different sparse formats behave quite differently on diverse input matrices [87,119,139,163]. As mentioned in the work [12], an outdated algorithm cannot well reflect the current status of an application.…”
Section: Requirements for a Benchmark Suite
confidence: 99%
“…Our benchmark suite consists of a set of reference implementations from various tensor applications, each of which shows different computational behavior. Much like two-dimensional sparse matrices, the data layout, or the data structure used to hold a sparse tensor, has a significant impact on performance and storage [43,54]. It also has a significant impact on how the control flow for a given operation must be executed and its memory footprints.…”
Section: Introduction
confidence: 99%
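To make the layout point concrete, the sketch below stores the same small matrix in COO and CSR form and walks it row by row in each; the row-pointer array makes per-row traversal direct in CSR, whereas COO requires scanning all stored triples. The matrix and names are invented for the example and are not drawn from the cited benchmark suite.

    /* Same 3x4 sparse matrix in two layouts, to show how the chosen data
     * structure changes both storage and the traversal loop. */
    #include <stdio.h>

    int main(void)
    {
        /* A = [ 5 0 0 1 ]
         *     [ 0 0 2 0 ]
         *     [ 0 3 0 0 ]  (4 nonzeros)                                  */

        /* COO: one (row, col, value) triple per nonzero. */
        int    coo_row[] = {0, 0, 1, 2};
        int    coo_col[] = {0, 3, 2, 1};
        double coo_val[] = {5.0, 1.0, 2.0, 3.0};

        /* CSR: row indices compressed into per-row offsets. */
        int    csr_row_ptr[] = {0, 2, 3, 4};
        int    csr_col[]     = {0, 3, 2, 1};
        double csr_val[]     = {5.0, 1.0, 2.0, 3.0};

        /* Row-wise traversal is direct in CSR ...                        */
        for (int i = 0; i < 3; ++i)
            for (int k = csr_row_ptr[i]; k < csr_row_ptr[i + 1]; ++k)
                printf("CSR A[%d][%d] = %g\n", i, csr_col[k], csr_val[k]);

        /* ... while COO needs a scan over every stored triple.           */
        for (int k = 0; k < 4; ++k)
            printf("COO A[%d][%d] = %g\n", coo_row[k], coo_col[k], coo_val[k]);

        return 0;
    }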