2011
DOI: 10.1007/978-3-642-24471-1_13
Multiple-Instance Learning with Instance Selection via Dominant Sets

Abstract: Multiple-instance learning (MIL) deals with learning under ambiguity, in which patterns to be classified are described by bags of instances. There has been growing interest in the design and use of MIL algorithms, as MIL provides a natural framework for solving a wide variety of pattern recognition problems. In this paper, we address MIL from a view that transforms the problem into a standard supervised learning problem via instance selection. The novelty of the proposed approach comes from its selectio…

Cited by 15 publications (9 citation statements)
Citing publications: 2012–2024
References 18 publications
“…The averaging operation mitigates the effects of positive instances in negative bags. In [94], robustness to label noise is obtained by using dominant sets to perform clustering and select relevant instance prototypes in a bag-embedding algorithm similar to MILES [4].…”
Section: Label Noise
confidence: 99%
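The bag-embedding idea referenced above (similar to MILES) maps each bag to a fixed-length vector whose components measure how well the bag matches a set of selected instance prototypes. A minimal sketch under assumed details: the Gaussian similarity and the bandwidth `sigma` are illustrative choices, not values taken from the paper, and the prototypes are given rather than produced by dominant-sets clustering.

```python
import numpy as np

def embed_bag(bag, prototypes, sigma=1.0):
    """Map a bag (n_instances x d array) to one vector of length
    n_prototypes: component j is the bag's best match to prototype j
    under a Gaussian similarity (a MILES-style max-over-instances
    embedding). `sigma` is an illustrative bandwidth."""
    # Pairwise squared distances, shape (n_instances, n_prototypes).
    d2 = ((bag[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    sim = np.exp(-d2 / sigma**2)
    # For each prototype, keep the similarity of the closest instance.
    return sim.max(axis=0)

# Toy example: a bag of 3 instances and 2 prototypes in 2-D.
bag = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
prototypes = np.array([[0.0, 0.0], [2.0, 0.1]])
v = embed_bag(bag, prototypes)
```

Once every bag is embedded this way, any standard supervised classifier (e.g. a linear SVM) can be trained on the bag vectors, which is what makes the instance-selection step the crux of such methods.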
“…Such methods include clustering the instances and then selecting the cluster centers as prototypes. In [20] and [21], clustering of instances is performed. Erdem and Erdem [20] first rely on the standard assumption to cluster and prune the instances of negative bags, creating negative prototypes. The positive prototypes are created by selecting from each bag the instance that is furthest from the negative prototypes.…”
Section: B. Classifier and Informative Prototypes
confidence: 99%
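The positive-prototype selection rule described above (pick from each positive bag the instance furthest from the negative prototypes) can be sketched as a max-min distance search. This is only an illustration of the selection rule: the negative prototypes are assumed to have already been produced by clustering the negative instances (dominant sets in the paper, not reimplemented here), and the function name is hypothetical.

```python
import numpy as np

def select_positive_prototype(pos_bag, neg_prototypes):
    """From a positive bag (n_instances x d), pick the instance whose
    distance to its *nearest* negative prototype is largest, i.e. the
    instance least explainable by the negative class."""
    # Distance of each instance to each negative prototype,
    # shape (n_instances, n_prototypes).
    d = np.linalg.norm(pos_bag[:, None, :] - neg_prototypes[None, :, :],
                       axis=2)
    # Score = distance to the closest negative prototype.
    scores = d.min(axis=1)
    return pos_bag[scores.argmax()]

neg_prototypes = np.array([[0.0, 0.0], [1.0, 0.0]])
pos_bag = np.array([[0.1, 0.1], [0.5, 2.0], [0.9, 0.1]])
p = select_positive_prototype(pos_bag, neg_prototypes)
# p is [0.5, 2.0]: the only instance far from both negative prototypes.
```

The intuition is that, under the standard MIL assumption, instances in a positive bag that lie close to negative prototypes are likely negative, so the most distant instance is the best candidate for the bag's true positive.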
“…Multiple instance learning (MIL), i.e., learning from ambiguous data (the labels are related to bags, not instances within the bags, meaning that we only have partial or incomplete knowledge about training instances), has been widely studied and applied to many challenging tasks, such as text categorization [1], object tracking [2], person re-identification [3], computer-aided medical diagnosis [4], etc. Therefore, it has received considerable attention, and various algorithms, for example APR [5], DD [6], Citation-KNN [7], EM-DD [8], MI-Kernel [9], miSVM and MISVM [10], DD-SVM [11], MILES [12], MissSVM [13], MIGraph and miGraph [14], MILIS [15], MILDS [16], MILEAGE [17], mi-DS [18], CK_MIL [19], SMILE [20], MIKI [21], TreeMIL [22], MILDM [23], mi-Net and MI-Net [24], Attention and Gated-Attention MIL [25], etc., have been proposed to deal with the MIL problem. However, there are two issues that hinder its practical application.…”
Section: Introduction
confidence: 99%
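The ambiguity described above — labels attached to bags rather than to instances — is the defining property of MIL. A minimal sketch of how bag labels arise under the standard assumption (a bag is positive iff it contains at least one positive instance); the instance labels shown here are for illustration only and are hidden from the learner in practice.

```python
def bag_label(instance_labels):
    """Standard MIL assumption: a bag is positive iff at least one of
    its instances is positive. Only this bag-level label is observed
    during training; the per-instance labels remain unknown."""
    return int(any(instance_labels))

# Which instance made the first bag positive is ambiguous to the learner.
positive_bag = bag_label([0, 0, 1])   # 1
negative_bag = bag_label([0, 0, 0])   # 0
```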