IEEE INFOCOM 2020 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS) 2020
DOI: 10.1109/infocomwkshps50562.2020.9162704

Opening the Deep Pandora Box: Explainable Traffic Classification

Cited by 19 publications (14 citation statements)
References 6 publications
“…Finally, deployability issues especially includes aspects related to model inference, such as auditing/explainability of classification decisions for the non experts (which has practical relevance since unlike decision trees, DL models have no direct explanation [60], [61]) as well as more precise assessment of computational costs (e.g., to ensure the model execution is within the CPU/energy budget [19]), an aspect that this paper briefly covers but does not fully elucidate.…”
Section: Summary and Discussion (mentioning)
confidence: 99%
“…Traffic classification can be formulated as a Multivariate Time Series (MTS) classification task, where the input is packet-level data (e.g., packet size, direction) related to a single flow 1 , and the output is an application label. The state-of-the-art approaches [1,8,28,29,32,38,48] adopt deep learning methods (vanilla CNN and/or LSTM) with heavyweight architectures (from 1M to 2M weights), and do not discuss the impact of their model size on the accuracy and inference time. Some of them [8,32,38] propose classifiers along with an explainability method to support their predictions.…”
Section: Related Work (mentioning)
confidence: 99%
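The formulation described in the statement above (a single flow represented by per-packet features such as size and direction, classified by a vanilla CNN into an application label) can be illustrated with a minimal PyTorch sketch. The 20-packet window, layer widths, and 10-class output are assumptions for illustration only, far smaller than the 1M-2M-weight architectures the citing authors mention.

```python
import torch
import torch.nn as nn

class FlowCNN(nn.Module):
    """Toy 1D-CNN over per-packet features of one flow.

    Input : (batch, 2, n_packets) -- channel 0 = normalized packet size,
            channel 1 = direction (+1 upstream, -1 downstream).
    Output: logits over application labels.
    """
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over the packet dimension
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).squeeze(-1))

# A batch of 8 flows, each truncated/padded to 20 packets.
x = torch.randn(8, 2, 20)
print(FlowCNN()(x).shape)  # torch.Size([8, 10])
```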
“…The state-of-the-art approaches [1,8,28,29,32,38,48] adopt deep learning methods (vanilla CNN and/or LSTM) with heavyweight architectures (from 1M to 2M weights), and do not discuss the impact of their model size on the accuracy and inference time. Some of them [8,32,38] propose classifiers along with an explainability method to support their predictions. However, the post hoc explainability methods employed, such as SHAP [30], cannot provide perfectly faithful explanations with respect to the original model [40].…”
Section: Related Work (mentioning)
confidence: 99%
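The post hoc, model-agnostic explanation style questioned in the statement above (SHAP [30]) can be sketched as follows. The random-forest classifier and the synthetic packet-size features are placeholders, not the models of the cited papers; the point is that KernelSHAP fits a local approximation of the model around each input, which is why such explanations need not be perfectly faithful to the original model.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

# Hypothetical tabular view of flows: one row = sizes of the first 20 packets.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = (X[:, 0] > 0).astype(int)          # toy two-application label
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# KernelSHAP builds a local linear approximation of the model, so the
# resulting attributions are estimates, not exact explanations.
explainer = shap.KernelExplainer(model.predict_proba, shap.sample(X, 50))
shap_values = explainer.shap_values(X[:5])  # per-feature attributions, 5 flows
```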