2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2015.7299056
Multiple instance learning for soft bags via top instances

Abstract: A generalized formulation of the multiple instance learning problem is considered. Under this formulation, both positive and negative bags are soft, in the sense that negative bags can also contain positive instances. This reflects a problem setting commonly found in practical applications, where labeling noise appears on both positive and negative training samples. A novel bag-level representation is introduced, using instances that are most likely to be positive (denoted top instances), and its ability to se…
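The top-instances representation described in the abstract can be illustrated with a toy computation. The function name, the choice of k, and the use of a simple mean over the top-k scores are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def top_instance_score(instance_scores, k=3):
    """Bag-level score as the mean of the k highest instance scores.

    A sketch of the 'top instances' idea: instead of keeping only the
    single max-scoring instance (standard MIL max pooling), the bag is
    represented by its k most-positive instances, which is more robust
    when negative bags may also contain positive instances (soft bags).
    """
    scores = np.asarray(instance_scores, dtype=float)
    k = min(k, scores.size)
    top_k = np.sort(scores)[-k:]  # k largest instance scores
    return float(top_k.mean())

# A soft negative bag contaminated by one high-scoring instance:
bag = [0.1, 0.9, 0.2, 0.05]
top_instance_score(bag, k=3)  # averages the scores 0.9, 0.2, 0.1
top_instance_score(bag, k=1)  # k=1 recovers plain max pooling
```

With k=1 this reduces to standard max pooling, so the contaminating instance alone decides the bag label; larger k dilutes the effect of a single mislabeled or noisy instance.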

Cited by 63 publications (44 citation statements); references 20 publications.
“…The corresponding parameters are described in Table 1. The standard max-pooling MIL approach [44] is obtained with only one element, and the top instance model [39], Learning with Label Proportion [65], and global average pooling [70] can be obtained with more. Drawing from negative evidence [47,12,13], we can incorporate minimum-scoring regions to support classification, and our spatial pooling function can reduce to the kMax+kMin layer of [13].…”
Section: Wildcat Pooling
confidence: 99%
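The generalized pooling described in this passage can be sketched as a kMax+kMin operator. The parameters kmax, kmin, and alpha, as well as the simple weighted-mean combination, are assumptions for illustration, not the cited layer's exact definition:

```python
import numpy as np

def kmax_kmin_pool(region_scores, kmax=3, kmin=3, alpha=0.7):
    """Sketch of kMax+kMin spatial pooling: the class score combines
    the kmax highest-scoring regions with the kmin lowest-scoring
    regions (negative evidence).

    Special cases mentioned in the quoted passage:
      - kmax=1, kmin=0 recovers standard MIL max pooling,
      - kmax=len(scores), kmin=0 recovers global average pooling.
    """
    s = np.sort(np.asarray(region_scores, dtype=float))
    kmax = min(kmax, s.size)
    kmin = min(kmin, s.size)
    top = s[-kmax:].mean() if kmax > 0 else 0.0       # strongest evidence
    bottom = s[:kmin].mean() if kmin > 0 else 0.0     # negative evidence
    return float(top + alpha * bottom)
```

The minimum-scoring regions contribute through alpha, so strongly negative regions can lower the class score rather than being ignored, which is the point of the negative-evidence models cited.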
“…Max pooling [44] only selects the most informative region for the MIL prediction. Recent alternatives include Global Average Pooling (GAP) [70], the soft maximum in LSE pooling [58], Learning from Label Proportion (LLP) [65,36], and top max scoring [39]. Negative evidence models [47,12,13] explicitly select regions accounting for the absence of the class.…”
Section: Related Work
confidence: 99%
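The Log-Sum-Exp (LSE) pooling named in this passage is the smooth interpolation between average and max pooling; a minimal sketch, with the sharpness parameter r chosen arbitrarily here:

```python
import numpy as np

def lse_pool(scores, r=5.0):
    """Log-Sum-Exp (LSE) pooling over instance/region scores.

    As r -> infinity this approaches the max (hard MIL pooling);
    as r -> 0 it approaches the plain average (GAP-like behavior).
    Computed in a numerically stable way by shifting by the max score.
    """
    s = np.asarray(scores, dtype=float)
    m = s.max()
    return float(m + np.log(np.mean(np.exp(r * (s - m)))) / r)
```

Because the gradient of LSE spreads over all regions instead of only the argmax, it trains more smoothly than hard max pooling while still emphasizing the most informative regions.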
“…Positive and Negative: set kernels [20], sMIL [24], KI-SVM [22], miGraph [25], MI-SVM [15], MissSVM [26], soft-bag SVM [10], dMIL [11]. Positive and Unlabeled: PU-SKC (Sect. 3), puMIL [27].…”
Section: Convexity
confidence: 99%
“…Later, MIL has been considered as a graph-based learning problem [2,3,4,5,6]. In fact, MIL is applicable to a wide range of real-world problems such as molecule behavior prediction [7], drug activity prediction [1], domain theory [8], content-based image retrieval [9,10,11], visual tracking [12], object detection [13,14], text categorization [15], and medical diagnosis [16,17].…”
Section: Introduction
confidence: 99%