Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2016
DOI: 10.1145/2939672.2939832
Online Optimization Methods for the Quantification Problem

Abstract: The estimation of class prevalence, i.e., the fraction of a population that belongs to a certain class, is a very useful tool in data analytics and learning, and finds applications in many domains such as sentiment analysis, epidemiology, etc. For example, in sentiment analysis, the objective is often not to estimate whether a specific text conveys a positive or a negative sentiment, but rather to estimate the overall distribution of positive and negative sentiments during an event window. A popular way of perfor…
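The prevalence-estimation task described in the abstract can be illustrated with a minimal classify-and-count sketch. This is a generic baseline for quantification, not the paper's proposed method; the scores and threshold below are hypothetical.

```python
# Classify-and-count: estimate the positive-class prevalence as the
# fraction of instances a classifier labels positive.
# Illustrative only; scores and threshold are made up.

def classify_and_count(scores, threshold=0.5):
    """Return the estimated prevalence of the positive class."""
    positives = sum(1 for s in scores if s > threshold)
    return positives / len(scores)

# Hypothetical classifier scores for eight documents.
scores = [0.9, 0.8, 0.3, 0.6, 0.2, 0.7, 0.1, 0.4]
print(classify_and_count(scores))  # fraction scored above threshold
```

Classify-and-count is known to be biased when the classifier errs asymmetrically, which is part of what motivates optimizing quantification-specific performance measures directly.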

Cited by 45 publications (35 citation statements)
References 30 publications
“…The recent years have seen much interest, as well as progress, in training directly with task-specific performance measures in the field of classification and ranking. Some notable works include those of [10,15] that investigate the statistical properties of plug-in classifiers for various non-decomposable objectives including F-measure, and [7,8,12,13] which propose stochastic gradient-style algorithms for optimizing non-decomposable performance measures such as F-measure, KL-divergence, area under the ROC curve (AUC), precision recall curve (AUCPR), recall at fixed precision (R@P), etc. However, all the works cited above focus only on training linear models.…”
Section: Related Work
Mentioning confidence: 99%
“…, (x_n, y_n)}, denote the sample average as P̂_S(w) = (1/n) Σ_{i=1}^{n} r⁺(w; x_i, y_i), and similarly define N(w), N̂_S(w). Unlike previous work [7,14], we will not restrict ourselves to concave surrogate reward functions. In particular we will utilize the sigmoidal reward, which is widely used as an activation function in neural networks and is non-concave: r_sigmoid(ŷ, y) = (1 + exp(−y · ŷ))⁻¹…”
Section: Name
Mentioning confidence: 99%
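The sigmoidal reward quoted above and its sample average can be sketched directly. The function names and data below are hypothetical; the formulas follow the quoted passage, with labels y ∈ {−1, +1} and ŷ the raw model score.

```python
import math

def r_sigmoid(y_hat, y):
    """Sigmoidal reward r(ŷ, y) = (1 + exp(−y·ŷ))⁻¹ from the quoted
    passage; non-concave in the score ŷ, unlike hinge-style surrogates."""
    return 1.0 / (1.0 + math.exp(-y * y_hat))

def sample_average_reward(scores, labels):
    """Sample average (1/n)·Σ_i r(ŷ_i, y_i) over a labeled sample,
    analogous to the P̂_S(w) quantity in the quote (helper name is ours)."""
    n = len(scores)
    return sum(r_sigmoid(s, y) for s, y in zip(scores, labels)) / n
```

A score of 0 gives reward 0.5 regardless of the label, and a confidently correct score (large y·ŷ) pushes the reward toward 1, which is what makes the sigmoid a smooth stand-in for the 0/1 indicator.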
“…Along with analytical methods, computational methods for solving nonlinear problems in engineering and computer science have been developed by Li et al. [17][18][19][20][21][22][23][24], Guo et al. [25], and Korda et al. [26].…”
Section: Introduction
Mentioning confidence: 99%