1999
DOI: 10.1007/bf02564711

Extensions of Dinkelbach's algorithm for solving non-linear fractional programming problems

Keywords: Fractional Programming, Parametric Optimization, Nonconvex Programming, Nonlinear Programming. AMS subject classification: 90C32, 90C31, 90C26, 90C30.

Cited by 41 publications (32 citation statements). References 19 publications.
“…Readers may refer to [42, 43] and references therein for details. Example 3.1: Consider the following linear fractional optimization problem subject to a system of sup-T_P equations, which was originally presented by Wu et al. [33]: …”
Section: Dinkelbach's Algorithm (mentioning; confidence: 99%)
“…Moreover, the constraint is concave, while all the other constraints are linear in a_q, k, and ς. Therefore, the solution to this problem can be obtained by solving a sequence of maximization problems that satisfy the optimality conditions. …”
Section: Max-min Energy-efficient Power Allocation (mentioning; confidence: 99%)
“…Therefore, the solution to this problem can be obtained by solving a sequence of maximization problems that satisfy optimality conditions [24, 25]. Based on Remark 5, and to solve problem R-MMEE-PA(q, k, ς)…”
Section: Max-min Energy-efficient Power Allocation (mentioning; confidence: 99%)
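For context, the "sequence of maximization problems" mentioned in these statements is the parametric reformulation at the core of Dinkelbach-type methods. A generic sketch follows, with the ratio objective N(x)/D(x), denominator D(x) > 0, and feasible set S assumed for illustration rather than taken from the citing paper:

```latex
% Generic Dinkelbach-type parametric subproblem (notation assumed for illustration).
% Maximize N(x)/D(x) over x in S with D(x) > 0 on S: at iteration n, given \lambda_n, solve
\[
  F(\lambda_n) \;=\; \max_{x \in S} \bigl\{\, N(x) - \lambda_n\, D(x) \,\bigr\},
  \qquad
  \lambda_{n+1} \;=\; \frac{N(x_n)}{D(x_n)},
\]
% where x_n is a maximizer of the subproblem; the optimal ratio \lambda^* is
% characterized by F(\lambda^*) = 0.
```

Each iteration thus solves an ordinary, non-fractional maximization problem, which is why the concavity and linearity of the constraints noted in the quote is what makes the subproblems tractable.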
“…In particular, Cour and Shi (2007) showed that it is NP-hard, in general, to solve for general eigenvectors under linear inequalities. So, we propose an iterative algorithm to find the global minimum of this optimization problem using Dinkelbach's method for fractional programming (FP) (Dinkelbach 1967; Rodenas et al. 1999). To make the paper self-contained, we give a brief description of this algorithm next. …”
Section: Dncut Framework Under Hard and Convex Constraints (mentioning; confidence: 99%)
“…The Dinkelbach algorithm was extended by Rodenas et al. (1999) to provide a general framework for FPs, summarized below as Algorithm 1. λ* is the global minimum value of the objective function. Here, we emphasize that the superlinear convergence property only concerns the number of iterations needed to reach λ*, not the convergence of each iteration. …”
Section: Dinkelbach Algorithm for Fractional Programming (mentioning; confidence: 99%)
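For orientation, a minimal sketch of the generic Dinkelbach iteration that these statements summarize as Algorithm 1 is given below, written in the minimization form quoted above. It is not the extended algorithm of Rodenas et al. (1999) itself: the linear-fractional problem data, the tolerance, and the use of scipy.optimize.linprog for the parametric subproblems are assumptions made purely for illustration.

```python
# Minimal sketch of the generic Dinkelbach iteration (illustrative data only):
#   minimize N(x)/D(x)  subject to  x in S,  with D(x) > 0 on S.
# Here N(x) = c.x + c0 and D(x) = d.x + d0 are affine, and
# S = {x : A x <= b, 0 <= x <= 3} is a bounded polytope.
import numpy as np
from scipy.optimize import linprog

c, c0 = np.array([2.0, 1.0]), 1.0   # numerator coefficients (made up)
d, d0 = np.array([1.0, 3.0]), 2.0   # denominator coefficients, positive on S
A = np.array([[-1.0, -1.0]])        # -x1 - x2 <= -1, i.e. x1 + x2 >= 1
b = np.array([-1.0])
bounds = [(0.0, 3.0), (0.0, 3.0)]

lam, tol = 0.0, 1e-8
for _ in range(100):
    # Parametric subproblem F(lam) = min_{x in S} { N(x) - lam * D(x) }.
    res = linprog(c - lam * d, A_ub=A, b_ub=b, bounds=bounds)
    x = res.x
    F = (c - lam * d) @ x + (c0 - lam * d0)
    if abs(F) < tol:                   # F(lam*) = 0 characterises the optimum
        break
    lam = (c @ x + c0) / (d @ x + d0)  # Dinkelbach update: ratio at the new point

print("global minimum ratio lambda* ≈", lam)
print("minimizer x* ≈", x)
```

As the quoted statement stresses, the superlinear convergence result applies to this outer sequence of λ values, not to whatever solver is used inside each parametric subproblem.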