Proceedings of the 30th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE 2022)
DOI: 10.1145/3540250.3549175
AutoPruner: transformer-based call graph pruning

Cited by 12 publications (1 citation statement)
References 43 publications
“…The use of backpropagation gradients through time, which can be computationally expensive, is not necessary with randomization-based learning techniques like echo state networks (ESN) [79]. Additionally, GNN and CNN pruning strategies can reduce the amount of parameters and computations needed, resulting in quicker inference times [80,81].…”
Section: Awan
confidence: 99%
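The pruning idea in the citation statement above — removing parameters so inference needs fewer computations — can be illustrated with a minimal magnitude-based pruning sketch. This is only an assumption for illustration; the cited works [80, 81] may use different pruning criteria, and `prune_by_magnitude` is a hypothetical helper, not an API from any of the referenced papers.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights.

    Zeroed weights contribute nothing to a dot product, so a sparse
    inference kernel can skip them entirely (hypothetical example).
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # Threshold = k-th smallest absolute value; everything at or below it is cut.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02]
pruned = prune_by_magnitude(weights, 0.5)
# Half the parameters are now zero: [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In practice, frameworks apply this per layer and often fine-tune afterwards to recover accuracy; the example only shows the core idea of discarding low-magnitude parameters.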