2019
DOI: 10.48550/arxiv.1909.05680
Preprint

pForest: In-Network Inference with Random Forests

Cited by 10 publications (18 citation statements)
References: 0 publications
“…This code word is the result of the feature tables lookup. Therefore the number of pipeline stages consumed by a decision tree is independent of the depth of the tree, overcoming previous limitations [12,29].…”
Section: Decision Tree
Mentioning, confidence: 99%
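
To make the mechanism described above concrete, the following is a minimal Python sketch of such an encode-based mapping: one lookup per feature returns a code fragment, and the concatenated code word selects the label in a single further lookup, independent of tree depth. The feature names, thresholds, and labels are invented for illustration and are not taken from the cited works.

# Hypothetical per-feature threshold sets; every name and value here is illustrative.
FEATURE_THRESHOLDS = {
    "pkt_len": [128, 512],   # two internal nodes in the toy tree test pkt_len
    "iat_us":  [1000],       # one internal node tests inter-arrival time
}

def feature_code(name, value):
    # Emulates a per-feature match table: map the value to a vector of
    # threshold outcomes (1 means value <= threshold).
    return tuple(int(value <= t) for t in FEATURE_THRESHOLDS[name])

# The final code table: the combined code word identifies a leaf directly, so
# classification costs one lookup per feature plus one code-word lookup,
# regardless of how deep the tree is.
CODE_TABLE = {
    ((1, 1), (1,)): "benign",
    ((1, 1), (0,)): "benign",
    ((0, 1), (1,)): "scan",
    ((0, 1), (0,)): "benign",
    ((0, 0), (1,)): "ddos",
    ((0, 0), (0,)): "ddos",
}

def classify(pkt_len, iat_us):
    code_word = (feature_code("pkt_len", pkt_len), feature_code("iat_us", iat_us))
    return CODE_TABLE[code_word]

print(classify(pkt_len=64, iat_us=200))    # benign
print(classify(pkt_len=1500, iat_us=50))   # ddos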
“…Implementing binary neural networks was explored in N2Net [44] and BaNaNa Split [40], with limited performance benefits. pForest [12], SwitchTree [29] and NERDS [55] explored mapping random forests, with only pForest attempting ASIC implementation. Their methodology is different, encoding each decision tree in separate tables, with a table for every tree level, achieving lower scalability (e.g., depth of 4 in pForest).…”
Section: Related Work
Mentioning, confidence: 99%
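
For contrast, below is a minimal Python sketch of the per-level table layout that the statement attributes to pForest and SwitchTree: one lookup per tree level, so a depth-d tree consumes d pipeline stages. The tree, feature names, and labels are again invented for illustration.

# One dict per tree level to mimic "a table for every tree level".
# Keys: (current node id, comparison outcome); values: next node id or label.
# Node ids, features, thresholds, and labels are all illustrative.
LEVEL_0 = {          # root node 0 tests pkt_len <= 512
    (0, True): 1,
    (0, False): 2,
}
LEVEL_1 = {          # nodes 1 and 2 test iat_us <= 1000 and emit a label
    (1, True): "benign",
    (1, False): "scan",
    (2, True): "scan",
    (2, False): "ddos",
}

def classify(pkt_len, iat_us):
    node = LEVEL_0[(0, pkt_len <= 512)]        # level-0 lookup: one stage
    return LEVEL_1[(node, iat_us <= 1000)]     # level-1 lookup: another stage

print(classify(pkt_len=256, iat_us=50))      # benign
print(classify(pkt_len=1400, iat_us=5000))   # ddos

Each additional level needs another table and another stage, which is why this style of mapping caps the usable tree depth (e.g., 4 in pForest), whereas the code-word approach sketched earlier does not.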
“…In-device machine learning (ML) offloading: Work that focuses on the programmable data-plane for switches based on Protocol Independent Switch Architecture (PISA) [22,36,38,45] tries to cope with the limited resources that are available for the implementation of ML models. For example [45] adapt several ML models to the match-action table model in P4, while [15] implementing a random forest model to classify traffic. In both cases, the benefits of running the analytics in the data-plane are limited since no immediate packet forwarding decision derives from the analytics.…”
Section: Related Work
Mentioning, confidence: 99%
“…Researchers have so far demonstrated the offloading of algorithms such as Support Vector Machines (SVM) 1 [59,64,67], Naïve Bayes (NB) [64,67], K-means (KM) [19,59,64,67], Decision Trees (DT) [64,67,68], Random Forest (RF) [7,36,63,65,67,68], and (Binary) Neural Networks (NN) [48,50,54,56,57,59,70]. These works have provided proofs-of-concept and laid the foundation of in-network ML.…”
Section: Introduction
Mentioning, confidence: 99%