2020
DOI: 10.3390/math8040512

Reduced Dilation-Erosion Perceptron for Binary Classification

Abstract: Dilation and erosion are two elementary operations from mathematical morphology, a non-linear lattice computing methodology widely used for image processing and analysis. The dilation-erosion perceptron (DEP) is a morphological neural network obtained by a convex combination of a dilation and an erosion followed by the application of a hard-limiter function for binary classification tasks. A DEP classifier can be trained using a convex-concave procedure along with the minimization of the hinge loss function. A…
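
To make the model concrete, here is a minimal NumPy sketch of the decision rule described in the abstract. The names w and m (structuring elements), lam (the convex-combination weight), and the zero threshold are illustrative assumptions, not the paper's exact notation.

```python
import numpy as np

def dilation(x, w):
    # Gray-scale dilation of input x by structuring element w:
    # delta_w(x) = max_j (x_j + w_j)
    return np.max(x + w)

def erosion(x, m):
    # Gray-scale erosion of input x by structuring element m:
    # eps_m(x) = min_j (x_j + m_j)
    return np.min(x + m)

def dep_predict(x, w, m, lam):
    # DEP: convex combination of a dilation and an erosion,
    # followed by a hard-limiter for binary classification.
    # lam in [0, 1]; parameter names are assumed for illustration.
    s = lam * dilation(x, w) + (1.0 - lam) * erosion(x, m)
    return 1 if s >= 0 else -1
```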

Cited by 20 publications (21 citation statements)
References 60 publications

“…Valle proposes a greedy algorithm where the dilation and the erosion perceptrons are trained separately and combined later by minimizing the average hinge loss [31]. This method allows the inclusion of a regularization term $C\|u - r\|_1$ in the objective function, where $u = w$ or $u = m$ and $r$ is a reference term.…”
Section: Training Morphological Network via Convex-Concave Procedures (mentioning; confidence: 99%)
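
For illustration, the sketch below spells out the kind of objective this statement describes: the average hinge loss of a single dilation perceptron plus the penalty $C\|w - r\|_1$. The function name and the NumPy formulation are assumptions; [31] minimizes such objectives with a convex-concave procedure rather than evaluating them directly like this.

```python
import numpy as np

def dilation_hinge_objective(w, X, y, C, r):
    # Decision value of a dilation perceptron for each row of X:
    # f(x) = max_j (x_j + w_j).
    scores = np.max(X + w, axis=1)
    # Average hinge loss over labels y in {-1, +1}.
    hinge = np.maximum(0.0, 1.0 - y * scores).mean()
    # l1 regularization pulling w toward the reference term r.
    return hinge + C * np.sum(np.abs(w - r))
```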
“…The Dilation-Erosion Perceptron suffers from a major flaw as a lattice-based model: it presupposes a partial ordering on both the features and the classes. By simply inverting the classes N and P, the performance of the classifier might severely drop [31]. A way to counteract this behavior lies in the use of reduced morphological operators based on a reduced ordering: Definition 2.…”
Section: Training Morphological Network via Convex-Concave Procedures (mentioning; confidence: 99%)
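
As a concrete (and hypothetical) reading of this idea: a surrogate function h maps elements into a totally ordered set, the reduced ordering is x ≤_h y iff h(x) ≤ h(y), and dilation and erosion become the h-argmax and h-argmin. The helpers and the linear h below are illustrative only; in approaches like the reduced DEP, such a mapping can be learned from data.

```python
def reduced_dilation(values, h):
    # Supremum under the reduced ordering x <=_h y iff h(x) <= h(y):
    # the element whose surrogate value h(x) is largest.
    return max(values, key=h)

def reduced_erosion(values, h):
    # Infimum under the same reduced ordering.
    return min(values, key=h)

# Example: order 2-D feature vectors by a hypothetical linear score.
points = [(0.2, 0.9), (0.8, 0.1), (0.5, 0.5)]
h = lambda p: 0.7 * p[0] + 0.3 * p[1]
top = reduced_dilation(points, h)  # (0.8, 0.1) under this h
```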
“…Neural networks can be divided into feedforward neural networks (the perceptron, the back-propagation neural network (BPNN), and the convolutional neural network (CNN)), feedback neural networks (Elman and Hopfield neural networks), and self-organizing map (SOM) neural networks, according to how the neurons in the network are interconnected. The perceptron can only handle linearly separable problems [54]. The BP neural network adds hidden layers and the back-propagation algorithm on top of the perceptron, and has strong nonlinear mapping and optimization abilities [55].…”
Section: Introduction (mentioning; confidence: 99%)
“…Hence, we take advantage of the recent advances in deep, sparse, and non-negative auto-encoders to design a new framework able to learn part-based representations of an image database that are compatible with morphological processing. In that respect, this work is part of the resurgent research line investigating interactions between deep learning and mathematical morphology [9,22,23,27,32]. However, in contrast to these studies, which focus mainly on introducing morphological operators into neural networks, the present paper addresses a different question.…”
Section: Introduction (mentioning; confidence: 99%)