Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM '22)
DOI: 10.1145/3511808.3557234

An Accelerated Doubly Stochastic Gradient Method with Faster Explicit Model Identification

Cited by 3 publications (3 citation statements) | References 23 publications
“…[51] Furthermore, there are studies on GAP safe screening for optimization algorithms based on stochastic gradient descent [52], [53]…”
Section: Results (confidence: 99%)
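
The first two statements refer to GAP safe screening (Fercoq, Gramfort, and Salmon, 2015), which uses the duality gap to certify that certain coefficients are zero at the optimum, so their features can be dropped from subsequent iterations. Below is a minimal sketch of the GAP safe sphere test for the ℓ1-regularized least-squares (Lasso) objective; the function name and the dual-point construction are illustrative assumptions, not code from the cited works.

```python
import numpy as np

def gap_safe_screen(X, y, w, lam):
    """GAP safe sphere test for the Lasso (illustrative sketch).

    Primal: P(w) = 0.5 * ||y - X w||^2 + lam * ||w||_1.
    Feature j is provably inactive at the optimum if
        |x_j^T theta| + radius * ||x_j||_2 < 1,
    where theta is a dual-feasible point and radius = sqrt(2 * gap) / lam.
    """
    residual = y - X @ w
    # Rescale the residual into the dual feasible set {theta : ||X^T theta||_inf <= 1}.
    scale = min(1.0, lam / max(np.max(np.abs(X.T @ residual)), 1e-12))
    theta = (scale / lam) * residual

    primal = 0.5 * residual @ residual + lam * np.abs(w).sum()
    dual = 0.5 * (y @ y) - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)  # duality gap; shrinks to zero at the optimum

    radius = np.sqrt(2.0 * gap) / lam
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0  # True -> coefficient certified zero; feature can be removed
```

The mask can be recomputed as the solver converges: the gap shrinks, the safe sphere tightens, and more features are screened out, which is what makes the rule "safe" rather than heuristic.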
“…Several studies exist in the paradigm of stochastic gradient methods [52], [53]. Although it is not clear whether our method can be directly applied to them, this topic is worth discussing in future work. Additionally, the sphere refinement technique [51] can further accelerate SPP for some loss functions…”
Section: Discussion (confidence: 99%)
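
The paper's title refers to a doubly stochastic gradient method, which typically means that each step samples both a mini-batch of examples and a block of coordinates, with the proximal (soft-thresholding) step exposing the sparse support as it is identified. Below is a minimal sketch of one non-accelerated doubly stochastic proximal-gradient step for an ℓ1-regularized least-squares objective; it illustrates the sampling pattern only and is not the paper's algorithm.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (element-wise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def doubly_stochastic_prox_step(X, y, w, lam, lr, batch, block, rng):
    """One doubly stochastic proximal-gradient step (illustrative sketch).

    'Doubly stochastic' = sample a mini-batch of examples AND a block of
    coordinates; only the sampled coordinates are updated, so the per-step
    cost is O(batch * block) rather than O(n * d).
    """
    n, d = X.shape
    rows = rng.choice(n, size=batch, replace=False)  # sampled examples
    cols = rng.choice(d, size=block, replace=False)  # sampled coordinates
    residual = X[rows] @ w - y[rows]
    grad_block = X[np.ix_(rows, cols)].T @ residual / batch
    # Soft-thresholding pins some coordinates at exactly zero; the set of
    # zeroed coordinates is the explicitly identified part of the model.
    w[cols] = soft_threshold(w[cols] - lr * grad_block, lr * lam)
    return w
```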
“…The challenges of MLaaS come from several aspects: inference latency and privacy. To accelerate MLaaS training and inference, accelerated gradient sparsification [12], [13] and model compression methods [14]-[22] have been proposed. On the other hand, a major limitation of MLaaS is the requirement for clients to reveal raw input data to the service provider, which may compromise user privacy…”
Section: Introduction (confidence: 99%)
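
Gradient sparsification, referenced above as an acceleration technique, cuts communication in distributed training by transmitting only the largest-magnitude gradient entries while accumulating the untransmitted remainder locally. A minimal top-k sketch with error feedback follows; the function and parameter names are illustrative and not drawn from references [12], [13].

```python
import numpy as np

def topk_sparsify(grad, residual, k):
    """Top-k gradient sparsification with error feedback (illustrative sketch).

    Adds the locally accumulated residual to the fresh gradient, keeps the
    k largest-magnitude entries for communication, and stores the remainder
    back into the residual so no gradient mass is permanently discarded.
    """
    corrected = grad + residual
    idx = np.argpartition(np.abs(corrected), -k)[-k:]  # indices of the top-k entries
    sparse = np.zeros_like(corrected)
    sparse[idx] = corrected[idx]        # the part that would be communicated
    new_residual = corrected - sparse   # error fed back into the next step
    return sparse, new_residual
```

Error feedback is what keeps aggressive sparsification (e.g., keeping well under 1% of entries) from stalling convergence: dropped components are replayed on later steps instead of being lost.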