2021 IEEE Wireless Communications and Networking Conference (WCNC)
DOI: 10.1109/WCNC49053.2021.9417340

Compressed Network in Network Models for Traffic Classification

Cited by 6 publications (4 citation statements, 2023–2024) · References 12 publications
“…Other approaches include model optimization using compression techniques [24]. In [43], Lu et al. propose a compressed Network in Network (NIN) model for TC. They design a step-wise pruning and Knowledge Distillation (KD) strategy to train the compressed model, aiming to reduce storage and computing resources.…”
Section: Optimized DNN for TC
Confidence: 99%
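As a rough illustration of the step-wise pruning plus KD strategy this citation describes, the sketch below alternates small structured-pruning steps with distillation fine-tuning in PyTorch. It is a minimal sketch only, not the paper's actual training code: the `teacher`/`student` models, the data `loader`, and the hyperparameters (`T`, `alpha`, `ratio`, `steps`) are all assumptions.

```python
# Hedged sketch: step-wise filter pruning interleaved with knowledge
# distillation (KD). Models, loader, and hyperparameters are hypothetical.
import torch
import torch.nn.functional as F
import torch.nn.utils.prune as prune

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target KD loss (Hinton-style) combined with hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

def stepwise_prune_and_distill(teacher, student, loader, steps=4, ratio=0.2):
    """Alternate a small pruning step with KD fine-tuning, several times."""
    teacher.eval()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for _ in range(steps):
        # Prune a fraction of conv filters by L1 magnitude (structured).
        for m in student.modules():
            if isinstance(m, torch.nn.Conv2d):
                prune.ln_structured(m, name="weight", amount=ratio, n=1, dim=0)
        # Fine-tune the pruned student against the teacher's soft targets.
        for x, y in loader:
            with torch.no_grad():
                t_logits = teacher(x)
            loss = kd_loss(student(x), t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student
```

Pruning in several small steps with recovery training in between, rather than in one large cut, is what lets the compressed model retain accuracy; the exact schedule used in [43] is not reproduced here.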
“…The basic NIN architecture is optimized in a follow-up work using self-distillation and KD for TC [44]. The model is further optimized with pruning to remove redundant filters and employs knowledge distillation to train compressed models without compromising performance.…”
Section: Optimized DNN for TC
Confidence: 99%
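For the self-distillation mentioned in this citation, one common formulation has the network learn from a frozen snapshot of itself from the previous training round ("born-again" style). The sketch below shows that variant only under stated assumptions; the follow-up paper's exact self-distillation scheme may differ, and `model`, `loader`, and the hyperparameters are hypothetical.

```python
# Hedged sketch: self-distillation from a frozen snapshot of the same model.
import copy
import torch
import torch.nn.functional as F

def self_distill(model, loader, rounds=3, T=4.0, alpha=0.5, lr=1e-3):
    for _ in range(rounds):
        snapshot = copy.deepcopy(model).eval()  # frozen teacher = last round
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for x, y in loader:
            with torch.no_grad():
                t_logits = snapshot(x)          # soft targets from itself
            s_logits = model(x)
            soft = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                            F.softmax(t_logits / T, dim=1),
                            reduction="batchmean") * (T * T)
            loss = alpha * soft + (1 - alpha) * F.cross_entropy(s_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```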
“…In addition, a protocol profiling method and an automatic rule extraction method are suggested in the study. Later on, Lu et al. [30] proposed a network-in-network (NIN) model to reduce the need for computing and storage resources. Gradual pruning and knowledge distillation (KD) compression were used to train the model.…”
Section: Related Work
Confidence: 99%