2019
DOI: 10.3390/a12120249

SVM-Based Multiple Instance Classification via DC Optimization

Abstract: A multiple instance learning problem consists of categorizing objects, each represented as a set (bag) of points. Unlike in the supervised classification paradigm, where each point of the training set is labeled, labels are associated only with bags, while the labels of the points inside the bags are unknown. We focus on the binary classification case, where the objective is to discriminate between positive and negative bags using a separating surface. Adopting a support vector machine setting at the training…
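To make the bag-labeled setting concrete, here is a minimal sketch in Python of the standard MIL convention the abstract describes (a bag is positive if at least one of its instances falls on the positive side of the separating surface). The data, names, and linear scoring rule are illustrative assumptions, not the paper's code.

```python
import numpy as np

# Each bag is an (n_i, d) array of instances; labels exist only at bag level.
bags = [np.array([[0.2, 1.1], [1.5, 0.3]]),                     # bag 1
        np.array([[-0.4, -0.9], [-1.2, 0.1], [0.0, -0.5]])]     # bag 2
bag_labels = np.array([+1, -1])  # instance labels inside bags are unknown

def predict_bag(w, gamma, bag):
    """Standard MIL rule: a bag is positive iff at least one of its
    instances lies on the positive side of the hyperplane w.x - gamma = 0."""
    scores = bag @ w - gamma
    return 1 if scores.max() > 0 else -1

# Example with an (assumed) separating hyperplane:
w, gamma = np.array([1.0, 1.0]), 0.5
print([predict_bag(w, gamma, b) for b in bags])  # prints [1, -1]
```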

Cited by 11 publications (7 citation statements). References 38 publications (56 reference statements).
“…[7]), an approach exhibiting both good generalisation capabilities and high computational efficiency, since the training phase only requires the solution of a convex problem. The latter characteristic makes it possible to experiment with several different variants of the basic model in order to adapt it to different settings; see, for example, the recent works [2,3,6,9]. The main idea in the SVM technique is the introduction of the concept of "margin" in the strict separation of two sets of points by means of a hyperplane.…”
Section: The Model
confidence: 99%
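For reference, the margin idea this statement alludes to is captured by the standard soft-margin SVM primal problem (a textbook formulation, not taken from the cited paper): for points x_i with labels y_i ∈ {−1, +1},

```latex
\min_{w,\gamma,\xi}\;\; \frac{1}{2}\|w\|^2 + C\sum_{i} \xi_i
\quad \text{s.t.} \quad y_i\,(w^\top x_i - \gamma) \ge 1 - \xi_i,\;\; \xi_i \ge 0,
```

where minimizing \|w\|^2 maximizes the margin 2/\|w\| between the two supporting hyperplanes, and the slack variables \xi_i relax strict separation. The problem is convex, which is what makes the training phase computationally cheap.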
“…Function e(w, γ) is nonsmooth and nonconvex, but it can be put in DC (Difference of Convex) form (Le Thi and Pham Dinh 2005). This formulation is similar to those adopted in Andrews et al (2003) (the MI-SVM formulation) and Bergeron et al (2012), while the DC decomposition has been exploited in Astorino et al (2019c). The reader will find a recent survey of nonsmooth optimization methods in Gaudioso et al (2020a).…”
Section: The Approach
confidence: 99%
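To illustrate what such a DC form can look like, here is one standard decomposition of a bag-level hinge error of the kind that arises in SVM-based MIL; this is an illustrative example consistent with the DC literature, not necessarily the exact e(w, γ) of the paper. For a positive bag J with instances x_j, writing z_j = w^⊤ x_j − γ,

```latex
\underbrace{\max\Bigl(0,\; 1 - \max_{j \in J} z_j\Bigr)}_{\text{nonsmooth, nonconvex}}
\;=\;
\underbrace{\max\Bigl(1,\; \max_{j \in J} z_j\Bigr)}_{g(w,\gamma)\ \text{convex}}
\;-\;
\underbrace{\max_{j \in J} z_j}_{h(w,\gamma)\ \text{convex}},
```

where both pieces are convex because each z_j is affine in (w, γ), so their pointwise maxima are convex; the identity is checked by distinguishing the cases max_j z_j ≥ 1 and max_j z_j < 1.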
“…In particular, in Andrews et al (2003) the first SVM (Support Vector Machine) type model for MIL has been proposed, giving rise to a nonlinear mixed-integer program solved by means of a BCD (Block Coordinate Descent) approach (Tseng 2001). The same SVM-type model treated in Andrews et al (2003) has been addressed in Astorino et al (2019a) by means of a Lagrangian relaxation technique, while in Astorino et al (2019c) and Bergeron et al (2012) a MIL linear separation has been obtained by using ad hoc nonsmooth approaches. In Mangasarian and Wild (2008), the authors have proposed an instance-space algorithm, expressing each positive bag as a convex combination of its instances, whereas in Avolio and Fuduli (2021) a combination of the SVM and the PSVM (Proximal Support Vector Machine) approaches has been adopted.…”
Section: Introduction
confidence: 99%
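The BCD heuristic mentioned above for the MI-SVM model alternates between selecting one "witness" instance per positive bag and training a standard SVM on witnesses plus all negative instances. The following is a minimal sketch of that idea, assuming scikit-learn's LinearSVC for the convex subproblem; the function and variable names are my own, and this is not the authors' implementation.

```python
import numpy as np
from sklearn.svm import LinearSVC

def mi_svm(pos_bags, neg_bags, n_iters=20, C=1.0):
    """Block-coordinate-descent heuristic in the spirit of MI-SVM
    (Andrews et al., 2003). pos_bags, neg_bags: lists of (n_i, d) arrays."""
    X_neg = np.vstack(neg_bags)
    # Initialize each positive bag's witness as its centroid.
    witnesses = [bag.mean(axis=0) for bag in pos_bags]
    clf = None
    for _ in range(n_iters):
        # Block 1: fit a standard linear SVM on witnesses vs. negatives.
        X = np.vstack([np.vstack(witnesses), X_neg])
        y = np.hstack([np.ones(len(witnesses)), -np.ones(len(X_neg))])
        clf = LinearSVC(C=C).fit(X, y)
        # Block 2: in each positive bag, re-select the instance with the
        # largest decision value as the new witness.
        new_witnesses = [bag[np.argmax(clf.decision_function(bag))]
                         for bag in pos_bags]
        if all(np.array_equal(a, b) for a, b in zip(witnesses, new_witnesses)):
            break  # witness assignment is stable: the BCD loop has converged
        witnesses = new_witnesses
    return clf
```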
“…Finally, the article "SVM-Based Multiple Instance Classification via DC Optimization" by Annabella Astorino, Antonio Fuduli, Giovanni Giallombardo and Giovanna Miglionico considers the binary classification case of the multiple instance learning problem [18]. The problem is formulated as a nonconvex unconstrained NSO problem with a DC objective function, and an appropriate nonsmooth DC algorithm is used to solve this problem.…”
Section: Nonsmooth Optimization
confidence: 99%
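For context, the generic DC algorithm (DCA) step for minimizing f = g − h with g, h convex, in the spirit of Le Thi and Pham Dinh (2005), linearizes the concave part −h at the current iterate and solves the resulting convex subproblem; this is the textbook scheme, not necessarily the exact variant used in the article:

```latex
\xi^k \in \partial h(x^k), \qquad
x^{k+1} \in \operatorname*{arg\,min}_{x}\; \bigl\{\, g(x) - \langle \xi^k, x \rangle \,\bigr\}.
```

Since the linearization majorizes f at x^k, each iteration solves a convex problem and the objective values f(x^k) are nonincreasing, which is what makes the approach practical for nonsmooth nonconvex training problems like the one above.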