2020
DOI: 10.1609/aaai.v34i04.6068

Transfer Learning for Anomaly Detection through Localized and Unsupervised Instance Selection

Abstract: Anomaly detection attempts to identify instances that deviate from expected behavior. Constructing performant anomaly detectors on real-world problems often requires some labeled data, which can be difficult and costly to obtain. However, often one considers multiple, related anomaly detection tasks. Therefore, it may be possible to transfer labeled instances from a related anomaly detection task to the problem at hand. This paper proposes a novel transfer learning algorithm for anomaly detection that selects …

Cited by 18 publications (8 citation statements)
References 23 publications
“…The transfer anomaly detection methods presented in [9]-[12] make the same assumption as this work; that is, as in our method, these studies use both anomalous and normal samples from a related dataset (the source data) and only normal data from the dataset of interest (the target dataset). The algorithms presented in [10], [11] are instance-based transfer learning methods, where the source data is transferred to the target domain after estimating the distributions of both the source and the target data. After the transfer, an LOF-based anomaly detection algorithm is built using the transferred source data in [10] or using both the transferred source data and unlabeled target data in [11].…”
Section: B Transfer Learning In Anomaly Detection
confidence: 99%
“…The algorithms presented in [10], [11] are instance-based transfer learning methods, where the source data is transferred to the target domain after estimating the distributions of both the source and the target data. After the transfer, an LOF-based anomaly detection algorithm is built using the transferred source data in [10] or using both the transferred source data and unlabeled target data in [11]. An important assumption of the methods presented in [10], [11] is that an ample amount of target data is needed to ensure accurate density estimation of the target data distribution.…”
Section: B Transfer Learning In Anomaly Detection
confidence: 99%
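To make the instance-transfer pipeline described in these excerpts concrete, here is a minimal sketch, assuming kernel density estimates stand in for the papers' distribution estimation and a simple density-ratio cut stands in for their selection rule; the names transfer_then_lof and keep_fraction are illustrative, not from the cited works.

```python
# Hypothetical sketch of "estimate source/target distributions, transfer
# source instances, then fit LOF" -- not the cited papers' exact procedure.
import numpy as np
from sklearn.neighbors import KernelDensity, LocalOutlierFactor

def transfer_then_lof(X_source, X_target, bandwidth=1.0, keep_fraction=0.5):
    # Estimate densities of the source and target data (stand-in for the
    # distribution estimation step the excerpt refers to).
    kde_src = KernelDensity(bandwidth=bandwidth).fit(X_source)
    kde_tgt = KernelDensity(bandwidth=bandwidth).fit(X_target)

    # Keep the source instances that look most plausible under the target
    # distribution relative to the source distribution (density-ratio cut).
    log_ratio = kde_tgt.score_samples(X_source) - kde_src.score_samples(X_source)
    n_keep = max(1, int(keep_fraction * len(X_source)))
    X_transferred = X_source[np.argsort(log_ratio)[-n_keep:]]

    # As in the variant attributed to [11]: fit LOF on transferred source data
    # plus unlabeled target data; the variant in [10] would fit on
    # X_transferred alone.
    lof = LocalOutlierFactor(n_neighbors=20, novelty=True)
    lof.fit(np.vstack([X_transferred, X_target]))
    return lof  # lof.score_samples(X_new): lower means more anomalous
```

The density-ratio selection here is only one plausible reading of "after estimating the distribution of both source data and the target data"; it also makes visible why these methods need ample target data, since the target density estimate drives both the selection and the final LOF fit.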
“…The lack of labels in the anomaly detection task that motivated our approach invalidates these approaches. LOCIT [47] selects labeled source instances to transfer using a local distribution-based approach and constructs a KNN classifier from these selected source instances and the unlabeled target instances. Although LOCIT [47] can handle the situation where the source domain contains only normal instances, in that case the method degenerates into a KNN-based unsupervised anomaly detection method without knowledge transfer.…”
Section: Related Work
confidence: 99%
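The kNN classification step described in that excerpt can be illustrated roughly as below; treating the unlabeled target instances as pseudo-normal and taking the selected source instances as already given are simplifying assumptions, not LOCIT's actual selection procedure.

```python
# Rough illustration of a kNN classifier over selected labeled source
# instances and unlabeled target instances -- a sketch, not LOCIT itself.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def knn_anomaly_scores(X_source_sel, y_source_sel, X_target_unlabeled, X_query, k=10):
    # Selected labeled source instances keep their labels (0 = normal,
    # 1 = anomaly, assumed to include at least one anomaly); unlabeled
    # target instances are treated as pseudo-normal.
    X_train = np.vstack([X_source_sel, X_target_unlabeled])
    y_train = np.concatenate([y_source_sel, np.zeros(len(X_target_unlabeled))])

    clf = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    # Probability assigned to the anomaly class serves as the anomaly score.
    return clf.predict_proba(X_query)[:, 1]
```

If the selected source instances contain no anomalies, the classifier only ever sees one class, which mirrors the degeneration to an unsupervised, transfer-free detector noted in the excerpt.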
“…Transferring knowledge between anomaly detection tasks, which is called anomaly detection transfer, has been proposed to learn anomaly detectors using normal and abnormal instances in the source domain. These methods also use target normal instances for training [3,17,22,46,48]. However, training with abnormal labels may cause problems in some applications.…”
Section: Introduction
confidence: 99%
“…The task setting was too basic, and the task requirements were much easier than those of typical practical applications, where, for example, the same machine type but different models are used at different operating speeds and those conditions are not given as training data. Several independent research groups have tackled domain-shift or domain-adaptation related tasks [20]-[25], but few open datasets that could serve this need have been made available. Though the previous ToyADMOS dataset has some data variations that can be used for testing domain-shift conditions, we would like to have more variations in the test configuration.…”
Section: Introduction
confidence: 99%