2022
DOI: 10.1017/9781009004282
Optimization for Data Analysis

Abstract: Optimization techniques are at the core of data science, including data analysis and machine learning. An understanding of basic optimization techniques and their fundamental properties provides important grounding for students, researchers, and practitioners in these areas. This text covers the fundamentals of optimization algorithms in a compact, self-contained way, focusing on the techniques most relevant to data science. An introductory chapter demonstrates that many standard problems in data science can b…

Cited by 21 publications (14 citation statements)
References 0 publications
“…Alternatively, we may view these subproblems from the point of view of machine learning, where (essentially) the same class of optimization problems is solved by (possibly) different techniques. We refer the interested reader to [54,60] for a survey of optimization methods for machine learning and data analysis problems. These techniques might be applicable very successfully at least in certain situations.…”
Section: Using the Decomposition L
confidence: 99%
“…Remark 2. Note that studying the convergence of the lower-level optimization problem, which is extensively studied in the existing literature (Nesterov et al., 2018; Wright and Recht, 2022), is not the focus of this article. Assumption 5 holds for convex functions satisfying a quadratic growth condition (Karimi et al., 2016). Meanwhile, Assumption 6 holds for strongly convex functions, for example.…”
Section: Convergence to Stationary Points
confidence: 99%
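For reference, the quadratic growth condition mentioned in the excerpt above can be stated as follows. This is a sketch in our own notation (the symbols $\mu$ and $X^{*}$ do not appear in the excerpt itself):

```latex
% Quadratic growth: the objective grows at least quadratically with
% the distance to the solution set X^* = \arg\min_x f(x), for some \mu > 0:
f(x) - f^{*} \;\ge\; \frac{\mu}{2}\,\operatorname{dist}\bigl(x, X^{*}\bigr)^{2}
\qquad \text{for all } x,
% where f^* = \min_x f(x) is the optimal value.
```

Every strongly convex function satisfies this condition, so quadratic growth is a strict weakening of strong convexity (see Karimi et al., 2016).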
“…Compared with the original problem (4), the problem in (5) is an unconstrained optimization problem with simple structure, so we can apply efficient optimization algorithms such as stochastic gradient descent (Wright and Recht, 2022).…”
Section: A Computationally Efficient Formulation
confidence: 99%
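The excerpt above invokes stochastic gradient descent for an unconstrained problem with simple structure. A minimal sketch of the method follows; the toy least-squares instance and all names here are our own illustration, not taken from the cited work:

```python
import random

def sgd(grad_i, w0, n_samples, lr=0.01, epochs=100):
    """Minimal stochastic gradient descent: each step follows the
    gradient of a single randomly chosen sample's loss."""
    w = list(w0)
    for _ in range(epochs):
        for _ in range(n_samples):
            i = random.randrange(n_samples)  # pick one sample at random
            w = [wj - lr * gj for wj, gj in zip(w, grad_i(w, i))]
    return w

# Toy least-squares instance: fit a slope w so that xs[i] * w ≈ ys[i].
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated with true slope 2

def grad_i(w, i):
    # Gradient of the i-th squared loss 0.5 * (xs[i] * w[0] - ys[i]) ** 2
    return [(xs[i] * w[0] - ys[i]) * xs[i]]

w = sgd(grad_i, [0.0], len(xs), lr=0.02, epochs=200)
```

Because each step touches only one sample, the per-iteration cost is independent of the dataset size, which is what makes the method attractive for the large problems arising in data analysis.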