2021
DOI: 10.48550/arxiv.2107.06321
Preprint

A New Multipoint Symmetric Secant Method with a Dense Initial Matrix

Abstract: In large-scale optimization, when forming or storing Hessian matrices is prohibitively expensive, quasi-Newton methods are often used in lieu of Newton's method because they require only first-order information to approximate the true Hessian. Multipoint symmetric secant (MSS) methods can be thought of as generalizations of quasi-Newton methods in that they impose additional requirements on their approximation of the Hessian. Given an initial Hessian approximation, MSS methods generate a seq…
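The abstract's core idea — building a symmetric Hessian approximation from first-order information only — can be illustrated with a standard quasi-Newton update. This is a minimal sketch, not the paper's MSS method: it uses the classical symmetric rank-one (SR1) update, which enforces the secant condition B_new @ s = y using only gradient differences.

```python
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """Symmetric rank-one (SR1) quasi-Newton update (illustrative only,
    not the paper's MSS method). Returns a symmetric matrix B_new that
    satisfies the secant condition B_new @ s = y."""
    r = y - B @ s                      # residual of the secant condition
    denom = r @ s
    if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B                       # skip update to avoid numerical blow-up
    return B + np.outer(r, r) / denom  # symmetric rank-one correction

# Toy usage on f(x) = 0.5 * x^T A x, whose true Hessian is A
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x                 # first-order information only
x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.8])
s, y = x1 - x0, grad(x1) - grad(x0)    # step and gradient difference

B0 = np.eye(2)                         # a dense initial approximation
B1 = sr1_update(B0, s, y)
# B1 is symmetric and satisfies the secant condition B1 @ s = y
```

MSS methods, as the abstract notes, generalize this by imposing secant-style conditions at multiple points rather than just the most recent pair (s, y); the role of the dense initial matrix B0 is the subject of the paper itself.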

Cited by 6 publications (6 citation statements)
References 8 publications
“…Finally, we would like to identify specific features, then utilize generalized linear models to more exactly quantify the relationship the variables have with mental health decline. We are also interested in analyzing the accuracy, speed, and feature scores of some machine learning models on the COVID-19 mental health data when we replace the most common optimizers, such as GD, SGD, or Limited-memory BFGS, with ones that were introduced recently 32,33 .…”
Section: Discussion
confidence: 99%
“…In future work, we are interested in analyzing more data related to healthcare workers to examine the relationship between their mental health and other factors related or unrelated to COVID-19. We are also interested in implementing other optimization methods such as those that are introduced in [55] and [56] to see whether accuracy or speed of some models, which were utilized in this paper, can be improved.…”
Section: Discussion
confidence: 99%
“…In supervised learning, we convert the learning problem into an optimization problem by defining a loss function as the objective function. Optimization methods (e.g., [37][38][39][40][41][42][43][44][45][46]), which are often used to minimize a loss or error function in the model training process, play an important role in the speed-accuracy trade-off of machine learning algorithms. In the remainder of this section, we briefly review all supervised and unsupervised learning methods used in this paper.…”
Section: Artificial Intelligence Review
confidence: 99%
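The last excerpt's framing — supervised learning becomes an optimization problem once a loss function is chosen as the objective — can be sketched in a few lines. This is an illustrative example using plain gradient descent (the "GD" mentioned in the excerpts) on a least-squares loss; the data and learning rate are invented for the demonstration.

```python
import numpy as np

# Supervised learning as optimization: define a loss, then minimize it.
def mse_loss(w, X, y):
    """Mean-squared-error loss, the objective function of the training problem."""
    return 0.5 * np.mean((X @ w - y) ** 2)

def mse_grad(w, X, y):
    """Gradient of the MSE loss with respect to the weights w."""
    return X.T @ (X @ w - y) / len(y)

# Synthetic regression data (invented for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

# Plain gradient descent as the optimizer in the training loop
w = np.zeros(3)
lr = 0.1
for _ in range(500):
    w -= lr * mse_grad(w, X, y)   # one optimizer step on the loss
# w now approximates w_true; swapping this loop for SGD, L-BFGS, or a
# newer quasi-Newton method changes the speed/accuracy trade-off, not
# the problem formulation.
```

This is exactly the substitution point the citing papers describe: the loss and model stay fixed while the optimizer (GD, SGD, L-BFGS, or the MSS-type methods of the cited work) is exchanged.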