Classification and Recognition of 3D Image of Charlier moments using a Multilayer Perceptron Architecture
2018
DOI: 10.1016/j.procs.2018.01.118

Cited by 26 publications (7 citation statements)
References 13 publications
“…Output signals of the last layer neurons are the response of the whole network. The number of neurons in this layer depends on the problem being addressed (Amakdouf et al., 2018; Goodfellow et al., 2016). For example, in the problem considered in this paper, the price of real estate is estimated.…”
Section: Artificial Neural Network
confidence: 99%
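The excerpt above ties the width of the output layer to the task: estimating a single real-estate price is a regression with one output neuron, whereas a k-class problem would use k. Below is a minimal sketch of that idea in Python/NumPy; the layer sizes, the sigmoid hidden units and the linear output are illustrative assumptions, not details taken from the cited papers.

```python
# Minimal sketch (illustrative, not from the cited papers): the output layer
# width is set by the task, i.e. one neuron for a regression target such as
# a price, n_classes neurons for a classification problem.
import numpy as np

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden layer of sigmoid units; the last layer's signals are the
    response of the whole network."""
    h = 1.0 / (1.0 + np.exp(-(w1 @ x + b1)))    # hidden activations
    return w2 @ h + b2                          # output layer (linear here)

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 8, 16, 1                # n_out = 1: one estimated price
w1 = rng.normal(0.0, 0.1, (n_hidden, n_in)); b1 = np.zeros(n_hidden)
w2 = rng.normal(0.0, 0.1, (n_out, n_hidden)); b2 = np.zeros(n_out)

x = rng.normal(size=n_in)                       # one feature vector
print(mlp_forward(x, w1, b1, w2, b2))           # array with a single value
```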
“…If k = 2 (an even number, shown in Figure 9c), it is difficult to classify the object because two class labels achieve the same score.
• Orthogonal moments: used for face recognition [143], object classification [144], [145], object recognition [128], [146], and texture retrieval [147]; strength: strong signal descriptors with low-order elements [129]; limitation: computationally expensive [129].
• Krawtchouk polynomials: used for object recognition [130], edge detection [148], object classification [149], and image recognition [150]; strength: better performance in terms of reconstruction error [23], [130]; limitation: high computational time [23].
• Tchebichef polynomials: used for image analysis [131], face recognition [151], edge detection [132], and image retrieval [152]; strength: eliminate the need for numerical approximation in the discrete image domain [131], [133]; limitation: coefficient calculation is vulnerable to numerical instability at higher polynomial orders [132].
• Charlier polynomials: used for object recognition [153], [156], image classification [154], and image reconstruction [155]; strength: minimize both computation time and error propagation [136]; limitation: numerical inconsistency of coefficients at higher polynomial orders [136].
• SKTP: used for face detection [140]; strength: stable in noisy environments [140]; limitation: no clear information on handling major occlusion [140].
B. DECISION TREE: Decision trees are developed for classification tasks.…”
Section: Classifiers, A. Nearest Neighbour Classifier
confidence: 99%
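The survey excerpt above lists Charlier polynomials among the orthogonal-moment families used for recognition and classification, which is the feature family the indexed paper feeds to its multilayer perceptron. As a rough illustration of how such features can be obtained, the sketch below builds weighted Charlier polynomials with the standard three-term recurrence and projects an image onto them; the Charlier parameter a, the image size and the moment order are assumptions made for the example, and the paper's exact formulation may differ.

```python
# Hedged sketch: 2-D Charlier moments of an image via the classical
# three-term recurrence for Charlier polynomials C_n(x; a), made orthonormal
# with the Poisson-type weight w(x) = exp(-a) * a^x / x!.
import numpy as np
from math import lgamma

def charlier_basis(order, size, a):
    """Rows n = 0..order-1 of the weighted (orthonormal) Charlier polynomials
    evaluated at x = 0..size-1."""
    x = np.arange(size, dtype=float)
    # log of the weight and of the squared norm rho(n) = n!/a^n,
    # kept in log space for numerical stability
    log_w = -a + x * np.log(a) - np.array([lgamma(xi + 1.0) for xi in x])
    log_rho = np.array([lgamma(n + 1.0) - n * np.log(a) for n in range(order)])
    C = np.zeros((order, size))
    C[0] = 1.0
    if order > 1:
        C[1] = (a - x) / a
    for n in range(1, order - 1):
        # a*C_{n+1} = (n + a - x)*C_n - n*C_{n-1}
        C[n + 1] = ((n + a - x) * C[n] - n * C[n - 1]) / a
    return C * np.exp(0.5 * (log_w[None, :] - log_rho[:, None]))

def charlier_moments(img, order, a):
    """M[n, m] = sum_x sum_y K_n(x) K_m(y) img[x, y]."""
    Kx = charlier_basis(order, img.shape[0], a)
    Ky = charlier_basis(order, img.shape[1], a)
    return Kx @ img @ Ky.T

# toy usage: a 6x6 block of moments of a random 32x32 "image"
img = np.random.default_rng(1).random((32, 32))
print(charlier_moments(img, order=6, a=16.0).shape)   # (6, 6)
```

A vector of such low-order moments (extended to 3D by adding a third basis along the depth axis) is the kind of compact descriptor that can then be passed to a classifier such as an MLP.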
“…The weights and biases are initialized to small random values, and by stochastic gradient descent (SGD) they are iteratively updated on the training samples until the squared error (Eq. 22) reaches its minimum.…”
Section: MLP
confidence: 99%
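As a companion to the excerpt, here is a minimal sketch of the training loop it describes: weights and biases start at small random values and are updated sample by sample with stochastic gradient descent on a squared-error objective. The network shape, learning rate and toy data are assumptions made for the illustration, not values from the cited work.

```python
# Hedged sketch of MLP training by per-sample SGD on the squared error.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                         # toy inputs
y = (X @ np.array([1.0, -2.0, 0.5, 0.0]))[:, None]    # toy regression target

n_in, n_hid, n_out, lr = 4, 8, 1, 0.01
W1 = rng.normal(0.0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)   # small random init
W2 = rng.normal(0.0, 0.1, (n_hid, n_out)); b2 = np.zeros(n_out)

for epoch in range(50):
    for i in rng.permutation(len(X)):                 # one sample at a time: SGD
        x, t = X[i:i + 1], y[i:i + 1]
        h = np.tanh(x @ W1 + b1)                      # hidden layer
        out = h @ W2 + b2                             # linear output
        err = out - t                                 # d/d(out) of 0.5*(out - t)^2
        dW2 = h.T @ err;  db2 = err.sum(0)
        dh = (err @ W2.T) * (1.0 - h ** 2)            # backprop through tanh
        dW1 = x.T @ dh;   db1 = dh.sum(0)
        W2 -= lr * dW2;   b2 -= lr * db2              # gradient step
        W1 -= lr * dW1;   b1 -= lr * db1

print(float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))  # final squared error
```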