2018
DOI: 10.1007/s10278-018-0107-6

AdaptAhead Optimization Algorithm for Learning Deep CNN Applied to MRI Segmentation

Abstract: Deep learning is a subset of machine learning that is widely used in artificial intelligence (AI) fields such as natural language processing and machine vision. A deep convolutional neural network (DCNN) extracts high-level concepts from low-level features and is well suited to large volumes of data; in deep learning, high-level concepts are defined in terms of low-level features. Previously, optimization algorithms achieved lower accuracy in network training at a higher cost…

Cited by 49 publications (33 citation statements) · References 16 publications
“…The adaptive gradient descent in optimization provides adaptation to the previous gradient. Acceleration by Nesterov's momentum has been combined with the adaptive gradient descent because momentum helps to improve the convergence of adaptive gradient methods. The momentum term (i.e., δᵢ‖s_{i*}‖₂ in Equation (3)) accelerates convergence of the deterministic components, while the adaptive gradient concentrates on the stochastic components.…”
Section: Proposed Methods
confidence: 99%
“…Acceleration by Nesterov's momentum has been combined with the adaptive gradient descent because momentum helps to improve the convergence of adaptive gradient methods [30, 64]. The momentum term (i.e., δᵢ‖s_{i*}‖₂ in Equation (3)) accelerates convergence of the deterministic components, while the adaptive gradient concentrates on the stochastic components. Also, momentum leads to a lower complexity bound on the deterministic part compared with the bound of the adaptive gradient.…”
Section: Optimization in the 3D CNN
confidence: 99%
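The combination the two citation statements above describe, a Nesterov-style look-ahead momentum term applied on top of an adaptive (RMSProp-like) gradient scaling, can be sketched as a single update rule. This is a minimal illustration of the general technique, not the paper's exact AdaptAhead algorithm; all names and constants (w, v, s, mu, rho, lr) are illustrative assumptions.

```python
import numpy as np

def nesterov_adaptive_step(w, v, s, grad_fn, lr=0.01, mu=0.9, rho=0.9, eps=1e-8):
    """One update combining Nesterov look-ahead momentum with an
    RMSProp-style adaptive denominator. Hyperparameter names and
    values are illustrative, not taken from the paper."""
    g = grad_fn(w + mu * v)                   # gradient at the anticipated (look-ahead) position
    s = rho * s + (1 - rho) * g * g           # running average of squared gradients
    v = mu * v - lr * g / (np.sqrt(s) + eps)  # momentum step with adaptively scaled gradient
    return w + v, v, s

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w, v, s = np.array([3.0, -2.0]), np.zeros(2), np.zeros(2)
for _ in range(200):
    w, v, s = nesterov_adaptive_step(w, v, s, lambda x: 2.0 * x)
```

The momentum term drives steady progress along the smooth (deterministic) direction of the loss, while dividing by the running root-mean-square of gradients damps the noisy (stochastic) components, matching the division of labor the quoted statement describes.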
“…Specically, a number of historical data is input to iteratively update parameters until convergence, and learning method utilized here is the RMSProp. 48 Due to the limitation of textual length, detailed iterative process is le out. Aer that, a complete prediction mechanism is established for outlet indexes.…”
Section: Decodingmentioning
confidence: 99%
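The iterative RMSProp parameter update that the statement above leaves out can be sketched in standard textbook form; the cited model's actual inputs and architecture are not reproduced here, and the toy objective is an assumption for illustration.

```python
import numpy as np

def rmsprop_update(w, s, g, lr=0.1, rho=0.9, eps=1e-8):
    """One RMSProp step: divide the gradient by a running root-mean-square
    of past gradients. Standard RMSProp; constants are illustrative."""
    s = rho * s + (1 - rho) * g * g      # running average of squared gradients
    w = w - lr * g / (np.sqrt(s) + eps)  # adaptively scaled parameter update
    return w, s

# Iteratively update a scalar parameter toward convergence,
# minimizing the toy objective f(w) = w^2 with gradient g = 2w.
w, s = 5.0, 0.0
for _ in range(300):
    w, s = rmsprop_update(w, s, 2.0 * w)
```

Because the per-coordinate step size is normalized by the gradient's recent magnitude, RMSProp takes similar-sized steps on steep and shallow directions alike, which is why it is a common choice for training deep networks on historical data as described above.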
“…With the advent of deep learning, and especially the emergence of convolutional neural networks, this method of automatically learning features with convolutional neural networks has driven the rapid development of many visual tasks, including segmentation algorithms for medical exercise rehabilitation images. Hoseini et al [10] applied deep learning algorithms to the segmentation of neural cell images. Xiao et al [11] applied a deep learning algorithm to the benign/malignant discrimination of breast tumors.…”
Section: Related Work
confidence: 99%