2021
DOI: 10.1155/2021/5399816

Deep Nearest Neighbor Website Fingerprinting Attack Technology

Abstract: Website fingerprinting (WF) technologies enable local eavesdroppers to identify the specific websites a user visits by analyzing the encrypted traffic between the user and the Tor network entry node. The triplet fingerprinting (TF) technique has demonstrated that small-sample WF attacks are feasible. However, previous methods concentrate on extracting the overall features of website traffic while ignoring the importance of local website fingerprinting characteristics for small-sample WF…
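To make the abstract's distinction concrete, the following is a minimal sketch (my illustration, not code from the paper) contrasting "overall" trace features with "local", window-level features. It assumes the common representation of a Tor trace as a sequence of +1/-1 packet directions; the window size is an arbitrary choice for illustration.

```python
# Sketch only: overall vs. local views of a website traffic trace.
import numpy as np

trace = np.random.choice([-1, 1], size=2000)   # +1 outgoing, -1 incoming (toy data)

# Overall (global) statistics computed over the whole trace.
overall = {
    "n_packets": trace.size,
    "frac_outgoing": float((trace == 1).mean()),
}

# Local view: split the trace into fixed-size windows so per-region
# patterns (local fingerprinting characteristics) are preserved.
WINDOW = 200                                    # illustrative window size
windows = trace[: trace.size // WINDOW * WINDOW].reshape(-1, WINDOW)
local = windows.mean(axis=1)                    # one direction-balance value per window

print(overall, local[:3])
```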

Cited by 7 publications (2 citation statements) | References: 30 publications
“…Shen et al. [43] proposed BurNet, a fine-grained WF method that applies CNNs to unidirectional burst sequences; its architecture was designed to improve classification accuracy while reducing training time complexity. Guo et al. [44] proposed the deep nearest neighbor small-sample website fingerprinting (DNNF) attack: deep local fingerprinting features of websites are first extracted with a CNN, and webpage prediction is then carried out by a k-nearest neighbor (kNN) classifier. Lu et al. [45] proposed the Graph Attention Pooling Network for fine-grained website fingerprinting (GAP-WF), which introduces a trace graph to describe the contextual relationships between flows during webpage loading and uses graph neural networks (GNNs) to learn intra-flow and inter-flow features.…”
Section: A. Approaches
confidence: 99%
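As a rough illustration of the DNNF pipeline described in the citation statement above (CNN feature extraction followed by kNN classification), the sketch below wires a small 1-D CNN embedder to scikit-learn's k-nearest-neighbor classifier. The layer sizes, trace length, and k are assumptions for illustration, not the authors' settings.

```python
# Minimal sketch (not the authors' code): a 1-D CNN embeds each direction
# sequence, then scikit-learn's k-NN classifies webpages in embedding space.
import numpy as np
import torch
import torch.nn as nn
from sklearn.neighbors import KNeighborsClassifier

SEQ_LEN = 5000  # assumed trace length (packet directions, padded/truncated)

class CNNEmbedder(nn.Module):
    def __init__(self, emb_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=4), nn.ReLU(),
            nn.AdaptiveMaxPool1d(8),           # keeps the strongest local responses
        )
        self.proj = nn.Linear(64 * 8, emb_dim)

    def forward(self, x):                       # x: (batch, 1, SEQ_LEN)
        h = self.features(x).flatten(1)
        return self.proj(h)

def embed(model, traces):
    """traces: (n, SEQ_LEN) array of +1/-1 packet directions."""
    with torch.no_grad():
        x = torch.as_tensor(traces, dtype=torch.float32).unsqueeze(1)
        return model(x).numpy()

# Toy usage with random data standing in for labelled training traces.
model = CNNEmbedder().eval()
train_x = np.random.choice([-1, 1], (20, SEQ_LEN))
train_y = np.repeat(np.arange(5), 4)            # 5 websites, 4 traces each
test_x = np.random.choice([-1, 1], (3, SEQ_LEN))

knn = KNeighborsClassifier(n_neighbors=3).fit(embed(model, train_x), train_y)
print(knn.predict(embed(model, test_x)))        # predicted webpage labels
```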
“…When the size of the world was significantly increased, its performance degraded significantly to 30% precision and 70% recall.…”

Table (WF attack: DNN architecture(s)):
[26]: SDAE, CNN, LSTM
DF [27]: CNN
GRU and ResNet [28]: GRU, ResNet-50
p-FP [29]: MLP, CNN
Cache-based WF [30], [31]: CNN, LSTM
Var-CNN [32]: ResNet-18
Tik-Tok [34]: DF
2ch-TCN [35]: CNN
Realistic WF [109]: CNN, LSTM
Multi-session WF [39]: LSTM
Side-channel-information-based WF [41]: CNN, LSTM
BurNet [43]: CNN
DNNF [44]: CNN
GAP-WF [45]: GNN
Cross-trace WF [46]: DF
DNN with blind adversarial training [2]: DF
DNN with Tripod data augmentation [47]: DF, Var-CNN, ResNet-18, ResNet-34, VGG-16, VGG-19
DNN with HDA data augmentation [48]: Var-CNN, ResNet-34
Microarchitecture-based WF [49]: 1-D CNN
BAPM [110]: CNN, self-attention
FDF [52]: CNN, FC, self-attention
snWF [53]: CNN
WFD [111]: 1-D ResNets
DNN with Minipatch adversarial training [54]: DF
DNN with Bionic data augmentation [55]: Var-CNN
Semi-supervised learning:
GANDaLF [40]: GAN
PAS [114]: DCNN, DF, AWF
Transfer learning:
AF [42]: domain adversarial network
TLFA [51]: CNN, MLP
Metric learning:
TF [33]: triplet networks
CPWF [38]: CNN
CNN-BiLSTM-based Siamese networks [50]: Siamese networks, CNN, LSTM
Online WF [56]: TF
Meta-learning:
MBL [57]: CNN
Section: Performance
confidence: 99%
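Several of the small-sample approaches in the table above, such as TF [33] and the CNN-BiLSTM-based Siamese networks [50], rely on metric learning: an embedding network is trained with a triplet (or contrastive) loss so that traces of the same website cluster together in embedding space. The following is a minimal, hedged sketch of triplet-loss training for a WF embedder; the network shape and hyperparameters are illustrative assumptions, not the cited implementations.

```python
# Sketch only: triplet-loss training of a trace-embedding network.
import torch
import torch.nn as nn

embedder = nn.Sequential(                  # stand-in for a WF feature extractor
    nn.Conv1d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
    nn.AdaptiveAvgPool1d(4), nn.Flatten(),
    nn.Linear(32 * 4, 64),
)
triplet_loss = nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(embedder.parameters(), lr=1e-3)

# anchor/positive: two traces of the same site; negative: a different site (toy data).
anchor, positive, negative = (torch.randn(16, 1, 5000) for _ in range(3))

loss = triplet_loss(embedder(anchor), embedder(positive), embedder(negative))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

At test time, such an embedder is typically frozen and combined with a simple nearest-neighbor rule over a handful of labelled traces per website, which is what makes the small-sample setting workable.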