2018
DOI: 10.1007/978-3-030-01424-7_27
A Survey on Deep Transfer Learning

Abstract: As a new classification platform, deep learning has recently received increasing attention from researchers and has been successfully applied to many domains. In some domains, like bioinformatics and robotics, it is very difficult to construct a large-scale well-annotated dataset due to the expense of data acquisition and costly annotation, which limits its development. Transfer learning relaxes the hypothesis that the training data must be independent and identically distributed (i.i.d.) with the test data, w…

Cited by 2,293 publications (1,440 citation statements)
References 23 publications
“…In addition, custom-built CNNs as well as pre-trained models are supported. Pre-trained models can be used either for classifying new sets of images or for re-purposed training on a different classification task, a technique termed transfer learning (Tan et al, 2018). At the initiation of a training process, AID automatically generates the neural net architecture according to the requested input and output dimensions.…”
Section: Results
confidence: 99%
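The re-purposing described in this excerpt, reusing a pre-trained network as a frozen feature extractor and training only a new output layer on the target task, can be sketched with a toy NumPy model. Everything here is invented for illustration (the random-projection "base" stands in for pre-trained convolutional layers, and the two-cluster data stands in for a new image set); this is not code from AID or the cited survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pre-trained" feature extractor: a fixed random projection
# standing in for layers learned on a large source dataset. It stays frozen.
W_pretrained = rng.normal(size=(4, 8))

def extract_features(x):
    """Frozen base: map raw inputs into the learned feature space."""
    return np.tanh(x @ W_pretrained)

# New classification head for the target task, trained from scratch.
W_head = np.zeros((8, 2))

def train_head(X, y, lr=0.1, epochs=200):
    """Fit only the head via softmax regression; the base is never updated."""
    global W_head
    onehot = np.eye(2)[y]
    for _ in range(epochs):
        F = extract_features(X)                 # features from the frozen base
        logits = F @ W_head
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        W_head -= lr * F.T @ (p - onehot) / len(X)  # gradient step on head only

# Toy target-task data: two Gaussian clusters in 4 dimensions.
X = np.vstack([rng.normal(-1.0, 0.3, size=(20, 4)),
               rng.normal(+1.0, 0.3, size=(20, 4))])
y = np.array([0] * 20 + [1] * 20)

train_head(X, y)
preds = (extract_features(X) @ W_head).argmax(axis=1)
accuracy = (preds == y).mean()
```

Only `W_head` is fitted; the frozen base supplies reusable features, which is why far less labelled target data is needed than when training end to end.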
“…This issue can be tackled by increasing the amount of data. Here, due to the limited availability of labelled images, we applied a transfer learning approach, which has been shown to reduce the need for data (Tan et al, 2018). We reached a validation accuracy of 92.1% by first training the model on CIFAR-10 and then continuing training on the image data from differentiated MSCs.…”
Section: Quantification of Adipogenic Differentiation of MSCs
confidence: 99%
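The pretrain-then-continue recipe in this excerpt (train on a large source dataset such as CIFAR-10 first, then keep training the same weights on the small target dataset) can be illustrated with a toy logistic-regression sketch. The Gaussian data and the related source/target weight vectors are purely hypothetical stand-ins, not the MSC image data from the cited study.

```python
import numpy as np

rng = np.random.default_rng(1)

def train(W, X, y, lr=0.5, epochs=50):
    """A few epochs of logistic-regression gradient descent, starting from W."""
    for _ in range(epochs):
        logits = np.clip(X @ W, -30, 30)       # clip to avoid exp overflow
        p = 1.0 / (1.0 + np.exp(-logits))
        W = W - lr * X.T @ (p - y) / len(X)
    return W

def accuracy(W, X, y):
    return (((X @ W) > 0).astype(float) == y).mean()

# Source task: plentiful labelled data (stand-in for CIFAR-10).
Xs = rng.normal(size=(500, 6))
ws = rng.normal(size=6)
ys = (Xs @ ws > 0).astype(float)

# Target task: a related decision boundary, but only 10 labelled examples.
wt = ws + 0.1 * rng.normal(size=6)
Xt = rng.normal(size=(10, 6))
yt = (Xt @ wt > 0).astype(float)

W_pre = train(np.zeros(6), Xs, ys)             # pre-train on the source task
W_transfer = train(W_pre, Xt, yt, epochs=5)    # brief fine-tuning on the target
W_scratch = train(np.zeros(6), Xt, yt, epochs=5)  # baseline: no pre-training

# Held-out evaluation on the target distribution.
Xtest = rng.normal(size=(200, 6))
ytest = (Xtest @ wt > 0).astype(float)
acc_transfer = accuracy(W_transfer, Xtest, ytest)
acc_scratch = accuracy(W_scratch, Xtest, ytest)
```

Because the source and target boundaries are related, the pre-trained initialization already generalizes well to the target task after only a few fine-tuning steps, mirroring the reduced data requirement the excerpt describes.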
“…LSTM architectures excel at handling variable-length sequence inputs and can learn long-term dependencies between non-contiguous elements, enabling an input encoding that does not require peptide shortening or splitting (Figure 1B). The networks were trained with a transfer-learning protocol (32), which allows networks performing predictions for less well-characterized alleles to leverage information from extensively studied alleles (Figure 1C). Transfer learning was also used to train networks using both binding affinity and HLAp datasets.…”
Section: Methods: Implementation
confidence: 99%
“…These values cover the vast majority of lengths observed in naturally presented MHC-bound peptides (12). The networks were trained with transfer learning (26), which allows networks for less well-characterized alleles to leverage information from extensively studied alleles (Figure 1C). Transfer learning was also used to train networks combining binding affinity and HLAp datasets.…”
Section: Methods: Implementation
confidence: 99%