Cardinality Constrained Portfolio Optimization via Alternating Direction Method of Multipliers
DOI: 10.1109/tnnls.2022.3192065

Cited by 14 publications (5 citation statements)
References 45 publications
“…As for the comparison methods, in addition to the aforementioned NNOMP-PGD [8] and LAIT [5], the state-of-the-art sparse index tracking methods, we also compare our method with ℓ0-ADMM (Alternating Direction Method of Multipliers) [40]. Although ℓ0-ADMM focuses on the trade-off between return and risk, making it distinct from typical index tracking approaches, its ℓ0-norm constraint mirrors that of NNOMP-PGD and our proposed method, warranting its inclusion in our comparison.…”
Section: Experiments, A. Dataset and Settings (mentioning)
Confidence: 99%
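To make the comparison concrete, below is a minimal sketch of how an ℓ0-ADMM iteration typically enforces a cardinality (ℓ0-norm) constraint via hard thresholding. The simplified mean-variance objective, the variable names, and the omission of budget and long-only constraints are illustrative assumptions, not the exact formulation of [40].

```python
import numpy as np

def project_cardinality(v, K):
    """Keep the K largest-magnitude entries of v and zero the rest:
    the exact Euclidean projection onto {z : ||z||_0 <= K}."""
    z = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-K:]
    z[idx] = v[idx]
    return z

def l0_admm(mu, Sigma, K, lam=1.0, rho=1.0, n_iter=200):
    """Illustrative ADMM for: min_w  w' Sigma w - lam * mu' w  s.t. ||w||_0 <= K.
    Budget and long-only constraints are omitted for brevity."""
    n = mu.shape[0]
    x = np.full(n, 1.0 / n)            # primal variable carrying the smooth objective
    z = x.copy()                       # copy carrying the cardinality constraint
    u = np.zeros(n)                    # scaled dual variable
    A = 2.0 * Sigma + rho * np.eye(n)  # x-update matrix, fixed across iterations
    for _ in range(n_iter):
        x = np.linalg.solve(A, lam * mu + rho * (z - u))  # quadratic x-update
        z = project_cardinality(x + u, K)                 # l0-projection z-update
        u += x - z                                        # dual ascent step
    return z

# Toy usage: 50 assets, at most 10 held.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(500, 50))
mu, Sigma = returns.mean(axis=0), np.cov(returns, rowvar=False)
w = l0_admm(mu, Sigma, K=10)
print(np.count_nonzero(w))  # at most 10
```

The key design choice is the splitting: the smooth return-risk term stays in the x-update, while the nonconvex ℓ0 constraint is handled exactly by a cheap projection in the z-update.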
“…Cardinality: In this work, the cardinality constraint was not used, in order to avoid complicated mathematics, but it is suggested to use cardinality to extend the degree of the investor's control over the portfolio [62][63][64][65][66][67].…”
Section: Conclusion and Research Prospects (mentioning)
Confidence: 99%
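For reference, the cardinality-constrained mean-variance problem that such an extension points to is commonly written as below; the risk-aversion weight λ and the long-only constraint are generic modeling choices, not specifics of the cited works.

```latex
\min_{w \in \mathbb{R}^n} \; w^\top \Sigma w - \lambda\, \mu^\top w
\quad \text{s.t.} \quad \mathbf{1}^\top w = 1, \quad w \ge 0, \quad \|w\|_0 \le K
```

Here \|w\|_0 counts the nonzero entries of w, so K directly caps the number of assets held, which is the "degree of control" the quoted passage refers to.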
“…6: Train this model with the gradient descent method (13) until it converges. 7: Calculate the thresholds τ by (15), and use Cond.RS (14) to select the important rules.…”
Section: First-order Sparse TSK Nonstationary Fuzzy Neural Network (S... (mentioning)
Confidence: 99%
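Since Eqs. (13)-(15) and Cond.RS are not reproduced in the citation statement, the following is only a hypothetical sketch of the two quoted steps on a toy zero-order TSK model: gradient-descent training of the rule consequents, followed by a threshold τ (here simply a quantile of an illustrative per-rule importance score) used to select rules. Every formula in it is an assumed stand-in for the cited equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data for a 1-D regression task.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0])

# Zero-order TSK model with R Gaussian rules (all names and choices hypothetical).
R, sigma = 12, 0.8
centers = rng.uniform(-3.0, 3.0, size=R)
c = np.zeros(R)  # trainable rule consequents

def norm_firing(X):
    """Normalized firing strengths of the R rules, shape (N, R)."""
    phi = np.exp(-(X - centers) ** 2 / (2.0 * sigma ** 2))
    return phi / phi.sum(axis=1, keepdims=True)

# Step 6 (paraphrased): train by gradient descent until approximate convergence.
lr = 0.5
for _ in range(500):
    W = norm_firing(X)
    err = W @ c - y
    c -= lr * (W.T @ err) / len(y)  # gradient of the mean squared error

# Step 7 (paraphrased): compute a threshold tau and select the important rules.
importance = np.abs(c) * norm_firing(X).mean(axis=0)  # illustrative score
tau = np.quantile(importance, 0.5)                    # stand-in for Eq. (15)
keep = importance >= tau                              # stand-in for Cond.RS (14)
print(f"kept {keep.sum()} of {R} rules")
```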
“…The training of FNNs is necessary work. Many training algorithms exist, such as backpropagation [5], the particle swarm algorithm [13], hybrid algorithms [14], and others [15]. Although evolutionary and hybrid algorithms work well, they require considerable running time.…”
Section: Introduction (mentioning)
Confidence: 99%