2022
DOI: 10.1016/j.array.2022.100182
Improving novelty detection using the reconstructions of nearest neighbours

Cited by 8 publications (17 citation statements); references 21 publications.
“…This is problematic as RFI detection algorithms must produce a Boolean mask of the pixel-precise regions where RFI is located in a given input. Therefore, we propose to use the Nearest-Latent-Neighbours (NLN) algorithm, an approach that combines latent measures of difference with the pixel-precise reconstruction errors from a generative autoencoding model (Mesarcik et al., 2022b). This is in contrast to Harrison et al. (2019), where particular features are extracted from each spectrogram to determine the novel differences in transmission power and frequency range.…”
Section: Novelty Detection in Radio Astronomy
confidence: 99%
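The combination the statement describes, latent-space distances to nearest neighbours blended with pixel-wise reconstruction error, can be sketched as follows. This is a hedged illustration only, not the authors' implementation: the function name, the simple weighted sum, the plain Euclidean latent distance, and the parameter `alpha` are all assumptions for exposition.

```python
import numpy as np

def nln_style_scores(latents_test, recons_test, x_test, latents_train,
                     k=2, alpha=0.5):
    """Hypothetical NLN-style anomaly score: a weighted blend of
    (a) mean Euclidean distance to the k nearest training latents and
    (b) per-sample pixel-wise reconstruction error."""
    # Pairwise Euclidean distances in latent space: (n_test, n_train)
    d = np.linalg.norm(
        latents_test[:, None, :] - latents_train[None, :, :], axis=-1
    )
    # Mean distance to the k nearest latent neighbours
    knn = np.sort(d, axis=1)[:, :k].mean(axis=1)
    # Mean squared reconstruction error per test sample
    rec = np.mean((x_test - recons_test) ** 2,
                  axis=tuple(range(1, x_test.ndim)))
    return alpha * knn + (1 - alpha) * rec

# Toy usage with random data standing in for encoder/decoder outputs
rng = np.random.default_rng(0)
z_train = rng.normal(size=(100, 8))   # latents of "normal" training data
z_test = rng.normal(size=(5, 8))      # latents of test inputs
x_test = rng.normal(size=(5, 16))     # test inputs (flattened)
recons = x_test + 0.1 * rng.normal(size=x_test.shape)  # mock reconstructions
scores = nln_style_scores(z_test, recons, x_test, z_train)
```

Higher scores indicate samples that are both far from normal data in latent space and poorly reconstructed, which is the intuition the citation statement attributes to NLN.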
“…To solve these problems we propose an unsupervised learning method based on the Nearest Latent Neighbours (NLN) algorithm (Mesarcik et al., 2022b). This approach leverages novelty detection to perform RFI detection.…”
Section: Introduction
confidence: 99%
“…The formula for calculating the Euclidean distance can be seen in equation 2.5: The K-Nearest Neighbor method is a supervised learning method whose classification is based on the distances to the closest neighbours in the training dataset. K-Nearest Neighbor is often used to implement distance-based classification because its formula is simple [17]. The advantages of this method are that it is robust to noisy training data and is effective on large training sets.…”
Section: K-Nearest Neighbor
confidence: 99%
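The Euclidean-distance-based k-NN classification described above (the excerpt references its equation 2.5 without reproducing it) can be sketched minimally as follows; the helper names and the toy data are illustrative assumptions, not from the cited work.

```python
import numpy as np
from collections import Counter

def euclidean(a, b):
    # d(a, b) = sqrt(sum_i (a_i - b_i)^2)
    return np.sqrt(np.sum((np.asarray(a) - np.asarray(b)) ** 2))

def knn_classify(x, X_train, y_train, k=3):
    """Predict the majority label among the k training points
    closest to x under the Euclidean distance."""
    dists = [euclidean(x, xi) for xi in X_train]
    nearest = np.argsort(dists)[:k]
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy usage: two well-separated classes
X_train = [[0, 0], [0, 1], [5, 5], [6, 5]]
y_train = ["a", "a", "b", "b"]
label = knn_classify([0.2, 0.2], X_train, y_train, k=3)  # → "a"
```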
“…In addition, the text will be converted to numbers so that it can proceed to the classification stage. This conversion is carried out using the TF-IDF (Term Frequency-Inverse Document Frequency) method [17].…”
Section: Word Weighting
confidence: 99%
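A minimal sketch of the TF-IDF weighting the statement refers to, assuming the common tf × log(N/df) variant; the cited source [17] may use a smoothed or otherwise adjusted formula, so treat this as illustrative.

```python
import math
from collections import Counter

def tf_idf(docs):
    """docs: list of token lists. Returns, per document, a dict
    mapping each term to tf * log(N / df), where tf is the term's
    relative frequency in the document and df its document frequency."""
    N = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        n = len(doc)
        weights.append({t: (c / n) * math.log(N / df[t])
                        for t, c in tf.items()})
    return weights

# Toy usage: "a" appears in every document, so its weight is zero
w = tf_idf([["a", "b"], ["a", "c"]])
```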
“…The nearest latent neighbours (NLN) algorithm proposed by Mesarcik et al. (2022a) introduces a novel approach to RFI detection by framing it as a downstream anomaly detection task, based on a generative model trained to represent only noiseless radio data as flagged by AOFlagger (Offringa, 2010). This methodology significantly reduces the dependence on high-quality flags in the training data, making it more suitable for real-world deployment.…”
Section: Introduction
confidence: 99%