2019
DOI: 10.48550/arxiv.1903.00816
Preprint

Stability of decision trees and logistic regression

Nino Arsov, Martin Pavlovski, Ljupco Kocarev

Abstract: Decision trees and logistic regression are two of the most popular and well-known machine learning algorithms, frequently used to solve a variety of real-world problems. The stability of a learning algorithm is a powerful tool for analyzing its performance and sensitivity, and it subsequently allows researchers to draw reliable conclusions. The stability of these two algorithms has remained obscure. To that end, in this paper, we derive two stability notions for decision trees and logistic regression: hypothesis and poin…

Cited by 2 publications (3 citation statements)
References 10 publications
“…All the methods are conducted using the target level α as input. Although CV+ only has a 1 − 2α finite-sample guarantee, its empirical coverage is approximately 1 − α under most settings, and thus using α rather than α/2 is the suggestion of Barber et al. (2021). We perform 100 independent repetitions for all settings and datasets to obtain the results shown in this section.…”
Section: Numerical Results
confidence: 99%
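The CV+ construction this statement refers to can be sketched compactly. Below is a minimal, hedged Python illustration of the CV+ prediction interval of Barber et al. (2021); the regressor, fold count, and quantile approximation are placeholder choices for illustration, not the cited paper's exact experimental setup.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeRegressor  # placeholder model choice

def cv_plus_interval(X, y, x_new, alpha=0.1, n_folds=10):
    """CV+ prediction interval (Barber et al., 2021) at a single test point."""
    n = len(y)
    lo, hi = [], []
    for train_idx, fold_idx in KFold(n_splits=n_folds, shuffle=True,
                                     random_state=0).split(X):
        # Fit with one fold held out; score held-out residuals and the test point.
        model = DecisionTreeRegressor().fit(X[train_idx], y[train_idx])
        residuals = np.abs(y[fold_idx] - model.predict(X[fold_idx]))
        mu_new = model.predict(x_new.reshape(1, -1))[0]
        lo.extend(mu_new - residuals)  # candidate lower endpoints
        hi.extend(mu_new + residuals)  # candidate upper endpoints
    # Empirical quantiles approximating the floor(alpha(n+1)) and
    # ceil((1-alpha)(n+1)) order statistics over the n leave-fold-out scores.
    q_lo = max(0.0, alpha * (1 + 1 / n))
    q_hi = min(1.0, (1 - alpha) * (1 + 1 / n))
    return np.quantile(lo, q_lo), np.quantile(hi, q_hi)
```

Running this at target level α returns an interval with the 1 − 2α finite-sample guarantee mentioned above, which is exactly why the quoted authors note that empirical coverage near 1 − α makes the α/2 correction unnecessary in practice.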
“…Thus, various learning algorithms are known to enjoy mean-square stability. For example, k-nearest neighbors (Devroye & Wagner, 1979a), support vector machines (Bousquet & Elisseeff, 2002), and bagging methods (Arsov et al., 2019) all have mean-square stability with γ_n = o(1/n).…”
Section: Theoretical Analysis
confidence: 99%
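For reference, a common formalization of the mean-square stability notion invoked above (following, e.g., Bousquet & Elisseeff, 2002) is sketched below; the cited works may differ in constants or in the exact perturbation used, so treat this as an assumption rather than their precise definition.

```latex
% Mean-square (replace-one) stability, a common formalization:
% D is a sample of n points, D^{(i)} replaces the i-th point with an
% independent copy, and f_D is the hypothesis learned from D.
\[
  \mathbb{E}_{D,\,z_i',\,x}\!\left[
    \bigl( f_D(x) - f_{D^{(i)}}(x) \bigr)^2
  \right] \le \gamma_n ,
  \qquad \gamma_n = o(1/n).
\]
```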
“…and investigate logistic regression under proportionally high-dimensional settings via approximate message passing theory, show the biased behaviour of ordinary maximum likelihood estimation, and propose procedures for debiased estimation and hypothesis testing. Arsov et al. (2019) study stability and generalization errors for some models, including logistic regression. Salehi et al. (2019) investigate the impact of some regularization schemes on logistic regression under proportionally high-dimensional settings via the convex Gaussian min-max theorem.…”
Section: High-dimensional Generalized Linear Models
confidence: 99%
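Since this statement cites the surveyed paper's stability analysis of logistic regression, the notion can be probed empirically: refit the model with one training point replaced and measure how far the decision function moves. The sketch below is illustrative only, assuming scikit-learn's LogisticRegression and a synthetic dataset; it is not the estimator or bound derived by Arsov et al. (2019).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

def decision_gap(i):
    """Refit with point i replaced by another sample point (a crude stand-in
    for an independent copy); return the mean absolute change in the decision
    function over the original sample."""
    base = LogisticRegression(max_iter=1000).fit(X, y)
    X_mod, y_mod = X.copy(), y.copy()
    j = rng.integers(len(y))
    X_mod[i], y_mod[i] = X[j], y[j]
    perturbed = LogisticRegression(max_iter=1000).fit(X_mod, y_mod)
    return np.mean(np.abs(base.decision_function(X)
                          - perturbed.decision_function(X)))

gaps = [decision_gap(i) for i in rng.integers(len(y), size=20)]
print(f"mean perturbation gap over 20 trials: {np.mean(gaps):.4f}")
```

A stable algorithm in the sense quoted above should see this gap shrink as n grows, roughly at the o(1/n) rate the citing authors attribute to the bagging results of Arsov et al. (2019).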