2018
DOI: 10.1049/iet-ipr.2017.0939

Level set evolution with sparsity constraint for object extraction

Cited by 19 publications (11 citation statements). References 49 publications.
“…The optimal distribution centers and distribution addressing schemes are shown in Tables 17 and 18. According to Tables 17 and 18, the optimal distribution center points found by CS algorithm for 6 and 10 distribution centers are (3,11,22,1,15,20) and (6,8,18,11,21,28,16,1,20,15). The optimal distribution center points found by IGA algorithm for 6 and 10 distribution centers are (10,22,21,2,20,17) and (30,23,14,1,2,11,25,24,15,4).…”
Section: Analysis of Experimental Results
confidence: 98%
“…The optimal distribution center points found by IGA algorithm for 6 and 10 distribution centers are (10,22,21,2,20,17) and (30,23,14,1,2,11,25,24,15,4). The optimal distribution center points found by CCS algorithm for 6 and 10 distribution centers are (23,22,21,16,15,20) and (6,10,23,14,22,25,7,16,15,20).…”
Section: Analysis of Experimental Results
confidence: 99%
“…Traditional methods struggle to solve these problems, so intelligent algorithms are required. Inspired by nature, these powerful metaheuristic algorithms are applied to NP-hard problems such as scheduling [3][4][5][6][7], image processing [8][9][10], feature selection [11][12][13], detection [14][15][16], path planning [17,18], cyber-physical social systems [19,20], texture discrimination [21], factor evaluation [22], saliency detection [23], classification [24,25], object extraction [26], gesture segmentation [27], economic load dispatch [28,29], shape design [30], big data and large-scale optimization [31][32][33], signal processing [34], silencing efficacy prediction [35], multi-objective optimization [36,37], unit commitment [38,39], vehicle routing [40], the knapsack problem [41][42][43], fault diagnosis [44][45][46], and test-sheet composition…”
Section: Introduction
confidence: 99%
“…As is well known, the sparse reconstruction model can accurately reconstruct images from highly undersampled observations, addressing the problems of limited acquisition data and slow acquisition times in MRI [2-6]. The goal of sparse reconstruction is to find the sparse solution x of the underdetermined linear system Ax = y, thereby reconstructing the original image from highly undersampled observations [7-12]. The sparse reconstruction model can be expressed as \min_x \frac{1}{2}\|y - Ax\|_2^2 + \lambda\varphi(x), where x ∈ ℝ^N denotes the magnetic resonance image to be reconstructed, y ∈ ℝ^M denotes the k-space signal acquired by the magnetic resonance coil, and A ∈ ℝ^{M×N} with M < N denotes the undersampled Fourier transform. Usually A = R × F, where R ∈ ℝ^{M×N} is an undersampling template [13-15], such as a radial, variable-density random, or Cartesian sampling template, and F ∈ ℝ^{N×N} is a sparsifying operator such as the Fourier or wavelet transform; φ(x) serves as the regularization term.…”
Section: Introduction
confidence: 99%
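The sparse reconstruction model quoted above is often solved with φ(x) = ‖x‖₁ via the iterative shrinkage-thresholding algorithm (ISTA). The sketch below is illustrative only, not the cited paper's solver: it substitutes a random Gaussian sensing matrix A for an undersampled Fourier operator, and the names (`ista`, `soft_threshold`) and parameter choices are assumptions of this example.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, step, n_iters=2000):
    # Minimize 0.5*||y - Ax||_2^2 + lam*||x||_1 by gradient step + shrinkage.
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)              # gradient of the data-fidelity term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy underdetermined system (M < N) with a sparse ground truth.
rng = np.random.default_rng(0)
M, N = 20, 50
A = rng.standard_normal((M, N)) / np.sqrt(M)  # stand-in for an undersampled operator
x_true = np.zeros(N)
x_true[[3, 17, 42]] = [1.0, -2.0, 1.5]
y = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2        # step <= 1/L, L = largest eigenvalue of A^T A
x_hat = ista(A, y, lam=0.01, step=step)
```

With a small λ the recovered `x_hat` concentrates its largest entries on the true support; the step size is set from the spectral norm of A to guarantee convergence of the gradient step.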