2018
DOI: 10.48550/arxiv.1806.06775
Preprint

Kernel-based Outlier Detection using the Inverse Christoffel Function

Abstract: Outlier detection methods have become increasingly relevant in recent years due to heightened security concerns and their vast applications across different fields. Recently, Pauwels and Lasserre (2016) observed that the sublevel sets of the inverse Christoffel function accurately depict the shape of a cloud of data using a sum-of-squares polynomial and can be used to perform outlier detection. In this work, we propose a kernelized variant of the inverse Christoffel function that makes it computationally tractable…

Cited by 2 publications (6 citation statements)
References 5 publications
“…Our main result quantitatively describes a decreasing relation between leverage score and population density for a broad class of kernels on Euclidean spaces. Numerical simulations support our findings.¹ Kernelized Christoffel functions were first proposed by Laurent El Ghaoui and independently studied in [4]. 32nd Conference on Neural Information Processing Systems (NIPS 2018), …”
supporting
confidence: 68%
“…¹ Kernelized Christoffel functions were first proposed by Laurent El Ghaoui and independently studied in [4].…”
mentioning
confidence: 99%
“…The present paper significantly extends the theory of finite-sample error bounds for support set estimators derived from Christoffel functions by applying techniques from Bayesian PAC analysis, a variation of classical PAC analysis that has been successfully applied to Gaussian process classifiers [29], kernel support vector machines [18], and minimum-volume covering ellipsoids [12]. This extension leverages a formal connection between the kernel empirical inverse Christoffel function investigated by Askari et al [2] and the posterior variance of a Gaussian process regression model. In conjunction with the PAC-Bayes theorem, the connection can be used to derive finite-sample bounds for kernelized empirical inverse Christoffel functions.…”
mentioning
confidence: 91%
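The connection mentioned in the statement above, between the kernelized empirical inverse Christoffel function and the posterior variance of a Gaussian process regression model, can be sketched as follows. This is a hedged illustration using a standard RBF kernel, not the exact estimator from the cited papers; the kernel width, noise level, and data are assumptions.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def gp_posterior_variance(X_train, X_query, gamma=0.5, noise=1e-3):
    """Posterior variance of a zero-mean GP:
    k(x, x) - k_x^T (K + noise * I)^{-1} k_x,
    with k(x, x) = 1 for the RBF kernel."""
    K = rbf(X_train, X_train, gamma) + noise * np.eye(len(X_train))
    Kq = rbf(X_query, X_train, gamma)
    sol = np.linalg.solve(K, Kq.T)
    return 1.0 - np.einsum('ij,ji->i', Kq, sol)

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 2))
q = np.array([[0.0, 0.0],    # inside the data cloud: low variance
              [8.0, 8.0]])   # far from all data: variance near the prior
var = gp_posterior_variance(X, q)
```

The posterior variance is small where training data is dense and reverts to the prior variance far from the data, which is exactly the support-approximating behavior that makes Christoffel-type scores useful for outlier detection.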
“…When the measure in question is defined by a probability distribution on R^n, the level sets of Christoffel functions are known empirically to provide tight approximations to the support. This support-approximating quality has motivated the use of Christoffel functions in several statistical applications, such as density estimation [19,20] and outlier detection [2]. Additionally, the level sets have been shown, using the plug-in approach [6], to converge exactly to the support of the distribution (in the sense of Hausdorff measure) when the degree of the polynomial approaches infinity and when the true probability distribution is available [20].…”
mentioning
confidence: 99%