2022
DOI: 10.3390/rs14205105
Continual Contrastive Learning for Cross-Dataset Scene Classification

Abstract: With the development of remote sensing technology, the continuing accumulation of remote sensing data has brought great challenges to the remote sensing field. Although multiple deep-learning-based classification methods have made great progress in scene classification tasks, they are still unable to address the problem of learning continuously. Faced with a constantly updated remote sensing data stream, model training inevitably forgets historical information, which leads …

Cited by 5 publications (4 citation statements)
References 47 publications
“…In CV, the combination of these two technologies is starting to be explored, and some works have already begun to demonstrate that SSL methods can learn incrementally [17], [29], [30]. In EO, this combination has not yet been explored, to the best of the authors' knowledge, apart from a few contributions combining weak supervision [31] and contrastive losses [32], [33] with CL. On the other hand, we observe an increasing interest in foundation models [15], [34], based also on new extensive datasets [14], [27], [28].…”
Section: Related Work
confidence: 99%
“…In [59], continual prototype calibration is proposed for few-shot classification in CL. The authors in [32] and [33] make use of contrastive learning to learn effective representations that can reduce catastrophic forgetting. A fine-grained CL algorithm for SAR incremental target recognition is presented in [60].…”
Section: B. Continual Learning in EO
confidence: 99%
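As a rough illustration of how contrastive representations can reduce catastrophic forgetting, as in the works attributed to [32] and [33] above, the following is a minimal sketch (not the cited papers' exact method): a supervised contrastive loss computed over a mix of new samples and a small replay buffer, so embeddings of old classes stay clustered while new classes are learned. The encoder, buffer, and temperature value are assumptions made for illustration.

```python
# Minimal sketch: supervised contrastive loss + replay buffer to mitigate
# forgetting. Illustrative only; hyperparameters and setup are assumptions.
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Pull same-class embeddings together, push different classes apart."""
    features = F.normalize(features, dim=1)        # unit-norm embeddings
    sim = features @ features.T / temperature      # pairwise similarities
    n = features.size(0)
    not_self = ~torch.eye(n, dtype=torch.bool, device=features.device)
    # Positive pairs share a label (replayed old classes count too).
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & not_self
    # Log-softmax over all non-self pairs.
    sim = sim.masked_fill(~not_self, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-probability of the positives for each anchor.
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts
    # Only anchors that actually have a positive contribute to the loss.
    return loss[pos_mask.any(dim=1)].mean()


def training_step(encoder, optimizer, new_x, new_y, buffer_x, buffer_y):
    """Mix replayed exemplars with new data so old classes stay clustered."""
    x = torch.cat([new_x, buffer_x])
    y = torch.cat([new_y, buffer_y])
    loss = supervised_contrastive_loss(encoder(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the loss acts on embedding geometry rather than classifier logits, replaying even a few old exemplars per step anchors the old-class clusters, which is one reason contrastive objectives pair naturally with CL.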
“…This significantly reduces the need for a large number of accurately labelled samples. For example, Peng Rui et al [46] used a contrastive learning model and a label propagation method to generate a large number of high-confidence labels from multi-scale unlabelled data, and finally fed the expanded sample set into a weakly supervised network to obtain the classification results of scene images. Liang et al [47] constructed a weakly supervised semantic segmentation network based on conditional generative adversarial networks and used a self-training method in which the generator produces pseudo-labels for unlabelled data, achieving weakly supervised semantic segmentation.…”
Section: Introduction
confidence: 99%
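The pseudo-labelling step attributed to [46] above can be sketched as follows: propagate a handful of known labels through a similarity graph over feature vectors, then keep only the high-confidence predictions as pseudo-labels. The synthetic features, the RBF kernel, and the 0.95 confidence threshold are illustrative assumptions, not the cited paper's actual configuration.

```python
# Minimal sketch of pseudo-labelling via label propagation over features.
import numpy as np
from sklearn.semi_supervised import LabelPropagation

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 64))        # e.g. scene embeddings
labels = np.full(500, -1)                    # -1 marks unlabelled samples
labels[:20] = rng.integers(0, 4, size=20)    # a handful of labelled scenes

model = LabelPropagation(kernel="rbf", gamma=0.5)
model.fit(features, labels)

# Keep only pseudo-labels whose propagated class probability is high.
confidence = model.label_distributions_.max(axis=1)
keep = (labels == -1) & (confidence > 0.95)
pseudo_labels = model.transduction_[keep]
print(f"{keep.sum()} high-confidence pseudo-labels generated")
```

The confidence filter is what makes the expanded sample set usable downstream: low-confidence propagations are discarded rather than fed to the weakly supervised classifier.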
“…Solutions dedicated to one area tend to fail on corner cases or generally less common observations, so they require long-term support [5]. The alternative is to be multi-functional and sufficiently capable for current concerns, meaning that a successful algorithm should handle its main cases well and generalize easily to the rare rest [6].…”
Section: Introduction
confidence: 99%