2022 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn55064.2022.9892135

Privacy Enhancement for Cloud-Based Few-Shot Learning

Cited by 10 publications (10 citation statements)
References 18 publications
“…With a sufficiently diverse set of pictures to use as training data that include a variety of poses, lighting conditions, and backgrounds, a model trained using the methodology described herein is able to re‐identify even unseen individuals without the need for re‐training the model as seen in the open‐set evaluation metrics. This is achievable due to the inherent FSL properties (Parnami & Lee, 2022) of the methodology and will be especially useful in re‐identification of free‐ranging animals.…”
Section: Discussion
confidence: 99%
“…One of the key developments in machine learning that made FSL methods attractive for the individual re‐identification problem domain was the development of loss functions specifically designed to work with small amounts of data (Parnami & Lee, 2022), such as a loss function known as triplet loss. A loss function is a metric that measures the error between the model's prediction and the ground truth data.…”
Section: Introduction
confidence: 99%
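The statement above describes triplet loss as a loss function suited to small amounts of data: it measures how well an embedding keeps same-identity samples close and different-identity samples apart. A minimal sketch of that idea follows; it is illustrative only, and the embedding size, margin, and toy tensors are assumptions rather than details from the cited papers.

```python
# Minimal sketch of triplet loss for few-shot re-identification.
# Embedding dimension, margin, and the random "embeddings" are placeholders.
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Pull the anchor toward the positive (same individual) and push it
    away from the negative (different individual) by at least `margin`."""
    d_pos = F.pairwise_distance(anchor, positive, p=2)
    d_neg = F.pairwise_distance(anchor, negative, p=2)
    return torch.clamp(d_pos - d_neg + margin, min=0.0).mean()

# Toy embeddings standing in for the output of an embedding network.
anchor   = torch.randn(8, 128)
positive = anchor + 0.05 * torch.randn(8, 128)   # same identity, slightly perturbed
negative = torch.randn(8, 128)                   # different identity

print(triplet_loss(anchor, positive, negative).item())

# PyTorch's built-in equivalent:
# torch.nn.TripletMarginLoss(margin=0.2)(anchor, positive, negative)
```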
“…As the workflow shown in Figure 1, then, an approach using the few‐shot learning (FSL) method (FSL Abs ) was built to decipher the relationship from Abs value to the relative intensity of molecular markers in 12 samples determined by the ESI‐FT‐ICR‐MS (see Text S6 in Supporting Information ) (Wright & Ziegler, 2017). FSL is an algorithm of the ML that is aimed to learn the underlying pattern from a few samples (Parnami & Lee, 2022), and has been widely used in previous object detection (Kisantal et al., 2019), cheminformatics (Chen et al., 2023), and environmental studies (Huang et al., 2023). Here, we used the synthetic minority over‐sampling technique (SMOTE) (Chawla et al., 2002) combined with the random forest (RF) algorithm provided by Ranger package (Fan et al., 2023; Hong, Cao, Fan, Lin, Bao, et al., 2022; Wright & Ziegler, 2017) to successfully build an FSL model without the risk of overfitting (has been proved in a 55 × 35799 ESI‐FT‐ICR‐MS data set, see details in Text S8 of the Supporting Information ) (Belgiu & Drăguţ, 2016; Cortes‐Ciriano & Bender, 2015; Jablonka et al., 2020), and then proved by the validation data set and model outputs (Text S9 in Supporting Information ) (Arulkumaran et al., 2017).…”
Section: Methods and Data Analysis
confidence: 99%
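The quoted methodology combines SMOTE oversampling with a random forest to learn from a handful of samples without overfitting. The cited study used the R Ranger package; the sketch below shows the general pattern only, with scikit-learn and imbalanced-learn standing in and a purely illustrative toy dataset.

```python
# Sketch of the SMOTE + random forest pattern, not the cited study's exact setup.
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Small, imbalanced toy dataset standing in for the few available samples.
X = rng.normal(size=(40, 10))
y = np.array([0] * 32 + [1] * 8)

# SMOTE runs only on each training fold inside the pipeline, so the
# cross-validation estimate is not inflated by synthetic samples.
pipe = Pipeline([
    ("smote", SMOTE(k_neighbors=3, random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=500, random_state=0)),
])

scores = cross_val_score(pipe, X, y, cv=5)
print(scores.mean())
```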
“…Fine-tune the original LLM model, which is computationally expensive [11,12]; 2. Prompting within the LLM, which only accommodates small amounts of data and requires iterative user input [13–15]; and 3. Retrieval-augmented generation (RAG).…”
Section: Introduction
confidence: 99%
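The third option in the quoted list, retrieval-augmented generation, sidesteps fine-tuning by retrieving relevant passages and prepending them to the prompt. The sketch below shows only that retrieve-then-prompt pattern; the `embed` function, corpus, and query are hypothetical placeholders, not part of the cited work.

```python
# Toy retrieve-then-prompt sketch of the RAG idea; replace `embed` with a real
# sentence-embedding model in practice.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in embedding: hashed character counts, L2-normalised."""
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(hash(ch) + i) % 64] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

corpus = [
    "Few-shot learning trains models from a handful of labelled examples.",
    "Triplet loss pulls same-class embeddings together and pushes others apart.",
    "Retrieval-augmented generation grounds an LLM in external documents.",
]
doc_vecs = np.stack([embed(d) for d in corpus])

query = "How can an LLM use external documents without fine-tuning?"
best = corpus[int(np.argmax(doc_vecs @ embed(query)))]   # cosine-style retrieval

prompt = f"Context:\n{best}\n\nQuestion: {query}\nAnswer:"
print(prompt)  # this augmented prompt would then be sent to the LLM
```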