2023
DOI: 10.1109/access.2023.3293105
A Feasible and Explainable Network Traffic Classifier Utilizing DistilBERT

Cited by 3 publications (11 citation statements) · References 50 publications
“…In recent years, there has been a surge in research centered on transformer architectures characterized by self-attention and multi-headed attention mechanisms. Transformer-structured models mainly utilize the BERT model, which has demonstrated strong performance in the NLP field, but research has recently also been conducted using the masked autoencoder (MAE), which originates in the CV field [33][34][35].…”
Section: Encrypted Traffic Classification
confidence: 99%
“…In [23], similar to [22], a pre-trained BERT model and a bidirectional LSTM are applied together, achieving an accuracy of about 99%. In [33], the authors utilize DistilBERT to perform encrypted traffic classification. They introduce contrastive learning to enhance classification speed without degrading performance.…”
Section: Encrypted Traffic Classification
confidence: 99%
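The BERT-style traffic classifiers discussed in these statements typically serialize raw packet bytes into token sequences before feeding them to the language model. The sketch below illustrates one common preprocessing scheme (hex "words" of two bytes, with `[CLS]`/`[SEP]` markers); the function name and token scheme are illustrative assumptions, not the exact pipeline of [33]:

```python
def bytes_to_tokens(payload: bytes, max_len: int = 128) -> list[str]:
    # Encode each two-byte chunk as a 4-hex-character "word", a common
    # tokenization for BERT-style traffic models (illustrative only;
    # the cited papers may use a different vocabulary).
    hex_str = payload.hex()
    tokens = [hex_str[i:i + 4] for i in range(0, len(hex_str), 4)]
    # Reserve two slots for the special tokens, then truncate.
    return ["[CLS]"] + tokens[: max_len - 2] + ["[SEP]"]

# Example: first bytes of a TLS record header from a ClientHello
sample = bytes.fromhex("1603010200010001fc0303")
print(bytes_to_tokens(sample))
# → ['[CLS]', '1603', '0102', '0001', '0001', 'fc03', '03', '[SEP]']
```

The resulting token list can then be mapped to IDs and passed to a DistilBERT encoder for sequence classification, which is the general setup the citing works describe.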