2023
DOI: 10.1016/j.commatsci.2022.111847
Segmentation of tomography datasets using 3D convolutional neural networks

Cited by 3 publications (3 citation statements)
References 30 publications
“…The following set of hyperparameters was chosen due to their reported contribution to the model performance:

• Learning rate (LR): This hyperparameter defines the amount by which the model weights are updated during backpropagation and controls the speed of the learning process; it is recognised as one of the hyperparameters having the greatest impact on the model performance. 3,65 Considering the recommendations in the literature 3,66,67 and the range of values used in the studies presented in Table 1, three learning rates were evaluated: 10⁻², 10⁻³, and 10⁻⁴.

• Learning rate reduction factor: As the training progresses, the model may benefit from reducing the learning rate in order to find further optimal states that would have been overlooked by larger learning rates. 19 To account for this observation, the learning rate was automatically reduced by a user-defined factor if the control set loss did not improve during five consecutive epochs.…”

Section: Methods
confidence: 99%
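The reduction-on-plateau rule described in the quoted passage (lower the learning rate when the control-set loss has not improved for five consecutive epochs) can be sketched, for instance, with PyTorch's ReduceLROnPlateau scheduler. The reduction factor of 0.1, the single-layer placeholder model, and the synthetic data below are assumptions for illustration, not the cited study's configuration.

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Placeholder 3D model; the cited work trains a full 3D CNN for segmentation.
model = torch.nn.Conv3d(1, 2, kernel_size=3, padding=1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # one of the evaluated LRs

# Reduce the LR by a user-defined factor (0.1 here is an assumption) when the
# monitored loss has not improved for five consecutive epochs, as in the quotation.
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)

# Tiny synthetic volume and voxel labels, only to make the loop executable.
x = torch.randn(2, 1, 16, 16, 16)
y = torch.randint(0, 2, (2, 16, 16, 16))
loss_fn = torch.nn.CrossEntropyLoss()

for _ in range(30):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    # In practice the scheduler would monitor the control-set (validation) loss.
    scheduler.step(loss.item())
```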
“…• Learning rate (LR): This hyperparameter defines the amount by which the model weights are updated during backpropagation and controls the speed of the learning process; it is recognised as one of the hyperparameters having the greatest impact on the model performance. 3,65 Considering the recommendations in the literature 3,66,67 and the range of values used in the studies presented in Table 1, three learning rates were evaluated: 10⁻², 10⁻³, and 10⁻⁴.

• Learning rate reduction factor: As the training progresses, the model may benefit from reducing the learning rate in order to find further optimal states that would have been overlooked by larger learning rates.…”

Section: CNN Hyperparameters Selection
confidence: 99%
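The three quoted learning rates could be compared with a small sweep such as the one below; the placeholder model, the synthetic control set, and the selection criterion (lowest final loss) are assumptions, not the cited study's procedure.

```python
import torch

def control_set_loss(lr: float) -> float:
    """Train a small placeholder 3D CNN at the given learning rate and
    return its final loss on synthetic data (illustrative only)."""
    torch.manual_seed(0)
    model = torch.nn.Conv3d(1, 2, kernel_size=3, padding=1)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    x = torch.randn(2, 1, 16, 16, 16)
    y = torch.randint(0, 2, (2, 16, 16, 16))
    for _ in range(20):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    return loss.item()

# The three learning rates evaluated in the quoted passage.
results = {lr: control_set_loss(lr) for lr in (1e-2, 1e-3, 1e-4)}
best_lr = min(results, key=results.get)
print(results, "best:", best_lr)
```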
“…Like all such methods, it is trained on a small amount of labeled (i.e., pre-segmented) data and, once trained, can produce segmentation predictions. Deep learning architectures have also been extended to 3D and applied in medical and materials science fields, fulfilling the need to segment variable 3D structures [13][14][15][16][17] . Much effort [18][19][20][21][22] has been dedicated to exploiting machine learning to accelerate the analysis of CT data of batteries, most of which is focused on the electrodes, especially the cathodes.…”

Section: Introduction
confidence: 99%
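As a concrete illustration of extending segmentation networks to 3D, the sketch below builds a minimal encoder-decoder from Conv3d layers in PyTorch; the layer widths and depth are assumptions and far smaller than the 3D U-Net-style networks typically used for tomography data in the cited works.

```python
import torch
import torch.nn as nn

class Tiny3DSegNet(nn.Module):
    """Minimal 3D encoder-decoder for voxel-wise segmentation (illustrative;
    real networks for tomography data are far deeper, e.g. 3D U-Nets)."""

    def __init__(self, in_channels: int = 1, num_classes: int = 2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(in_channels, 8, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),                        # halve each spatial dimension
            nn.Conv3d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(16, 8, kernel_size=2, stride=2),  # upsample back
            nn.ReLU(inplace=True),
            nn.Conv3d(8, num_classes, kernel_size=1),            # per-voxel logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# One 32^3 single-channel volume -> per-voxel class logits of the same size.
volume = torch.randn(1, 1, 32, 32, 32)
logits = Tiny3DSegNet()(volume)
print(logits.shape)  # torch.Size([1, 2, 32, 32, 32])
```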