2021 | Preprint
DOI: 10.48550/arxiv.2112.12181

Simple and near-optimal algorithms for hidden stratification and multi-group learning

Abstract: Multi-group agnostic learning is a formal learning criterion concerned with the conditional risks of predictors within subgroups of a population. The criterion addresses recent practical concerns such as subgroup fairness and hidden stratification. This paper studies the structure of solutions to the multi-group learning problem and provides simple and near-optimal algorithms for it.
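
For context, the criterion the abstract refers to is typically stated as follows. This is a standard formulation sketched from the surrounding literature, not a quotation from the paper; the notation (group collection G, benchmark class H, loss l) is an assumption of this sketch.

```latex
% Sketch of the multi-group agnostic learning criterion (notation assumed,
% not quoted from the paper): a predictor f is eps-optimal if, for every
% group g in the collection G, its conditional risk on g is within eps of
% the best conditional risk any benchmark hypothesis h in H attains on g.
\[
  \mathbb{E}\!\left[\ell(f(x), y) \mid x \in g\right]
  \;\le\;
  \min_{h \in \mathcal{H}} \mathbb{E}\!\left[\ell(h(x), y) \mid x \in g\right]
  + \varepsilon
  \qquad \text{for all } g \in \mathcal{G}.
\]
```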

Cited by 1 publication (2 citation statements) | References 23 publications

“…Concurrently and independently of our paper, Tosh and Hsu [Tosh and Hsu, 2021] study algorithms and sample complexity for multi-group agnostic learnability and give an algorithm ("Prepend") that is equivalent to our Algorithm 1 ("ListUpdate"). Their focus is on the sample complexity of batch optimization, however, in contrast to our focus on the discovery of groups on which our model is performing poorly online (e.g.…”
Section: Related Work
confidence: 99%
“…They also are not concerned with the details of the optimization that needs to be solved to produce an update; we give practical algorithms based on reductions to cost-sensitive classification, along with an empirical evaluation. The paper of Tosh and Hsu [Tosh and Hsu, 2021] also contains additional results, including algorithms producing more complex hypotheses but with improved sample complexity (again in the setting in which the groups are fixed up front).…”
Section: Related Work
confidence: 99%
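
The "Prepend"/"ListUpdate" equivalence described in these statements admits a compact sketch. The following Python is a hypothetical rendering under assumed names and a 0/1 loss, not the authors' pseudocode: the learner maintains a decision list and, whenever some hypothesis beats the current list's conditional risk on some group by a margin eps, prepends the rule "if x is in g, predict h(x)".

```python
# Hypothetical sketch of a Prepend / ListUpdate style algorithm, as
# described in the citation statements above. All names here
# (prepend_learn, conditional_risk, the 0/1 loss, the default label)
# are illustrative assumptions, not the authors' pseudocode.

def conditional_risk(predict, X, y, in_group):
    """Empirical 0/1 risk of `predict` on the examples inside a group."""
    idx = [i for i in range(len(X)) if in_group[i]]
    if not idx:
        return 0.0
    return sum(predict(X[i]) != y[i] for i in idx) / len(idx)

def prepend_learn(X, y, groups, hypotheses, eps):
    """Build a decision list of (group, hypothesis) rules.

    groups:     indicator functions g(x) -> bool
    hypotheses: predictors h(x) -> label
    eps:        required improvement margin per update
    """
    rules = []  # most recently prepended rule fires first

    def list_predict(x):
        for g, h in rules:
            if g(x):
                return h(x)
        return 0  # arbitrary default label (an assumption of this sketch)

    improved = True
    while improved:
        improved = False
        for g in groups:
            in_group = [g(xi) for xi in X]
            current = conditional_risk(list_predict, X, y, in_group)
            # This inner search is the optimization step the second
            # statement mentions: in practice it could be solved by a
            # reduction to cost-sensitive / weighted classification
            # (zero weight outside the group) rather than enumeration.
            for h in hypotheses:
                if conditional_risk(h, X, y, in_group) < current - eps:
                    rules.insert(0, (g, h))  # prepend: new rule takes priority
                    improved = True
                    break
            if improved:
                break
    return rules
```

With this structure, the final predictor answers each point x using the most recently added rule whose group contains x, which matches the decision-list form the statements attribute to both "Prepend" and "ListUpdate"; the margin eps ensures each prepend strictly improves some group's risk, so the loop terminates.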