2015
DOI: 10.1109/tsp.2015.2399870

Analytic Representation of Bayes Labeling and Bayes Clustering Operators for Random Labeled Point Processes

Abstract: Clustering algorithms typically group points based on some similarity criterion, but without reference to an underlying random process that would make clustering rigorously predictive. In fact, there exists a probabilistic theory of clustering in the context of random labeled point sets in which clustering error is defined in terms of the process. In the present paper, given an underlying point process we develop a general analytic procedure for finding an optimal clustering operator, the Bayes clusterer, t…
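A minimal sketch of the idea behind a Bayes clusterer, on assumed toy data (the label process, its probabilities, and the candidate partitions below are illustrative, not from the paper): clustering error for a candidate partition is taken as the minimum, over relabelings of its clusters, of the number of mismatched points, and a brute-force clusterer picks the partition minimizing this error in expectation over the label process.

```python
from itertools import permutations, product

def clustering_error(pred, true):
    # Error of a candidate partition against one labeling: the minimum,
    # over relabelings of the predicted clusters, of mismatched points.
    labs = sorted(set(pred) | set(true))
    return min(
        sum(mapping[p] != t for p, t in zip(pred, true))
        for perm in permutations(labs)
        for mapping in [dict(zip(labs, perm))]
    )

# Toy label process: four points whose true labeling is one of two
# partitions, with probabilities 0.6 and 0.4 (illustrative numbers).
label_process = [((0, 0, 1, 1), 0.6), ((0, 0, 0, 1), 0.4)]

def expected_error(pred):
    # Expected number of misclustered points under the label process.
    return sum(p * clustering_error(pred, labels) for labels, p in label_process)

# Brute-force "Bayes clusterer": the partition with minimal expected error.
candidates = list(product([0, 1], repeat=4))
bayes_partition = min(candidates, key=expected_error)
```

On real point processes the partition space is far too large to enumerate; the paper's contribution is an analytic procedure, and this brute force only illustrates the objective being minimized.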

Cited by 9 publications (23 citation statements)
References 23 publications (27 reference statements)
“…In this setting, the general objective of clustering is to group those nodes that are more similar to each other than to the rest, according to the relationship established by the dissimilarity function [1], [2]. Clustering and its generalizations are ubiquitous tools since they are used in a wide variety of fields such as psychology [3], social network analysis [4], political science [5], neuroscience [6], among many others [7], [8].…”
Section: Introduction
“…In a strict sense, the proof of Theorem 2 does not require c_X to be a cut metric. In fact, any nonnegative, symmetric function satisfying the identity property would suffice to create a relationship between nodes as in (7). Constructing such a relation is known as the Rips complex [30].…”
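Since equation (7) is not reproduced in this excerpt, the exact form of the relation is an assumption here; the sketch below shows the standard one-skeleton of a Vietoris–Rips construction, relating two nodes whenever their dissimilarity is at most a resolution parameter.

```python
def rips_relation(points, d, delta):
    # One-skeleton of a Vietoris-Rips complex at resolution delta:
    # relate x and y whenever their dissimilarity is at most delta.
    return {(x, y) for x in points for y in points if d(x, y) <= delta}

# Toy dissimilarity on integers: absolute difference. It is nonnegative,
# symmetric, and zero exactly when x == y (the identity property).
pts = [0, 1, 3]
edges = rips_relation(pts, lambda a, b: abs(a - b), delta=1)
# 0 and 1 are related at resolution 1; 0 and 3 are not.
```

Symmetry of the dissimilarity makes the resulting relation symmetric, and the identity property guarantees every node is related to itself, which is all the cited argument needs.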
“…Our interest here is in clustering, where the underlying process is a random point set and the aim is to partition the point set into clusters corresponding to the manner in which the points have been generated by the underlying process. Having developed the theory of optimal clustering in the context of random labeled point sets where optimality is with respect to mis-clustered points [1], we now consider optimal clustering when the underlying random labeled point process belongs to an uncertainty class of random labeled point processes, so that optimization is relative to both clustering error and model uncertainty. This is analogous to finding an optimal Wiener filter when the signal process is unknown, so that the power spectra belong to an uncertainty class [2].…”
Section: Introduction
“…The exceptions, for instance, expectation-maximization based on mixture models, typically focus on parameter estimation rather than defining and minimizing a notion of operator error. Work in [9] and [1] addresses the solution to (1) in the context of clustering using a probabilistic theory of clustering for random labeled point sets and a definition of clustering error given by the expected number of “misclustered” points. This results in a Bayes clusterer, which minimizes error under the assumed probabilistic framework.…”
Section: Introduction