2022 6th International Conference on Robotics and Automation Sciences (ICRAS)
DOI: 10.1109/icras55217.2022.9842039
Data-Free Knowledge Distillation for Privacy-Preserving Efficient UAV Networks

Cited by 5 publications (3 citation statements). References 13 publications.
“…To prevent direct contact with private information, TS transfers the complex neural network to lightweight networks with the consideration of preserving privacy. The new lightweight network is deployed in UAV networking to assist UAVs in recognizing targets [31].…”
Section: Knowledge Distillation
confidence: 99%
“…Several works have already used KD to obtain lightweight models suitable for UAVs. For example, Li et al [16] have applied this technique for video saliency estimation, while Liu et al [17], Yu [18], Ding et al [19], and Luo et al [20] used it for object detection, object recognition, action recognition, and UAV delivery, respectively. However, to our knowledge, no work has investigated knowledge distillation to produce efficient and accurate models tailored for UAVs in the context of weed mapping.…”
Section: B. Knowledge Distillation
confidence: 99%
“…One of the most promising approaches for model compression and domain adaptation is Knowledge Distillation (KD) [6], a machine learning paradigm that relies on the transfer of knowledge from a large Teacher network to a less complex Student model that can be implemented on the edge. In other application domains, KD has demonstrated effective results in maintaining the performance of the Teacher network, while facilitating scalability [7] and preservation of privacy [8].…”
Section: Introduction
confidence: 99%
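The statement above describes the standard teacher–student transfer that knowledge distillation relies on: the student is trained to match the teacher's softened output distribution. A minimal sketch of that Hinton-style objective follows; NumPy, the temperature value, and all function names here are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened teacher targets and student
    # predictions, scaled by T^2 as in the classic KD objective.
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # student predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T ** 2)

# Toy check: identical logits give zero loss; diverging logits, positive loss.
logits = np.array([[2.0, 0.5, -1.0]])
print(distillation_loss(logits, logits))                   # ~0.0
print(distillation_loss(np.zeros((1, 3)), logits) > 0.0)   # True
```

In the data-free setting the teacher's logits would come from synthetic or generator-produced inputs rather than the private training data, so the student never touches the sensitive samples.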