2018
DOI: 10.1007/s10288-018-0389-z

Variable-fixing then subgradient optimization guided very large scale neighborhood search for the generalized assignment problem
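For context, the problem named in the title is the standard generalized assignment problem (GAP). The formulation below is textbook background with generic symbols (c_{ij}, a_{ij}, b_i), not notation taken from the indexed paper:

\[
\min_{x}\; \sum_{i \in I}\sum_{j \in J} c_{ij}\, x_{ij}
\quad \text{s.t.}\quad
\sum_{i \in I} x_{ij} = 1 \;\; \forall j \in J, \qquad
\sum_{j \in J} a_{ij}\, x_{ij} \le b_i \;\; \forall i \in I, \qquad
x_{ij} \in \{0,1\},
\]

where x_{ij} = 1 assigns job j to agent i, c_{ij} is the assignment cost, a_{ij} is the resource that job j consumes on agent i, and b_i is the capacity of agent i.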

Cited by 11 publications (7 citation statements)
References 38 publications
“…SLBLR outperforms SAVLR [17] as well, thereby demonstrating that the fast convergence offered by the novel "level-based" stepsizing, with other things being equal, translates into better results as compared to those obtained by SAVLR, which employs the "contraction mapping" stepsizing [17]. Lastly, the methods developed in [18]-[20] specifically target GAPs, whereas the SLBLR method developed in this paper has broader applicability.…”
Section: Example 1: Generalized Assignment Problems (mentioning)
confidence: 82%
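The "stepsizing" contrasted in the snippet above refers to how the step length is chosen when maximizing a Lagrangian dual by subgradient optimization. As a minimal illustration only (a generic level/Polyak-style update with assumed symbols \lambda, g^{k}, \gamma_k, \overline{L}; not the specific SLBLR or SAVLR rule), dualizing the assignment constraints with multipliers \lambda gives a dual function L(\lambda) updated by

\[
\lambda^{k+1} = \lambda^{k} + s_k\, g^{k},
\qquad
s_k = \gamma_k\, \frac{\overline{L} - L(\lambda^{k})}{\lVert g^{k} \rVert^{2}},
\]

where g^{k} is a subgradient of L at \lambda^{k} and \overline{L} is a target (level) value. Level-based schemes set s_k from such a target, whereas contraction-mapping schemes shrink the step relative to the previous iterate; the formula above shows only the level-style variant.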
“…To test scalability, six instances d201600, d401600, d801600, e201600, e401600 and e801600 are considered. SLBLR is compared with Depth-First Lagrangian Branch-and-Bound Method (DFLBnB) [18], Column Generation [19], and Very Large Scale Neighborhood Search (VLNS) [20], which to the best of the authors' knowledge are the best methods for at least one of the above instances. For completeness, comparison against Surrogate Absolute-Value Lagrangian Relaxation (SAVLR) [17], which is an improved version of Surrogate Lagrangian Relaxation (SLR) [33], is also performed.…”
Section: Example 1: Generalized Assignment Problems (mentioning)
confidence: 99%
“…A small number of studies are multi-objective in nature ([20]-[23]). When the GAP literature is examined with respect to solution methods, it can be seen that the solution approaches can be grouped into exact methods ([12], [13], [18]-[20], [24], [25]) and heuristic/metaheuristic methods ([1], [3]-[6], [8]-[10], [15]-[17], [19], [21]-[23], [25]-[29]).…”
Section: Introduction (unclassified)