IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)
DOI: 10.1109/ijcnn.1999.831572
Neural networks in non-Euclidean metric spaces

Abstract: Multilayer Perceptrons (MLPs) …

Cited by 5 publications (4 citation statements)
References 17 publications
“…When rotational angle is included in the pose space, the appearance manifold possesses the topology of a solid torus S¹ × R². Learning algorithms to estimate non-Euclidean manifolds are still very underdeveloped, and methods that use self-organized maps or neural networks are not particularly efficient with appearance images [12], [5]. We are investigating how a method such as locally linear projection may be extended to handle these more challenging situations.…”
Section: Discussion
confidence: 99%
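The "locally linear projection" the citing authors mention belongs to the same family as locally linear embedding (LLE). A minimal sketch of that idea on synthetic pose-like data, assuming scikit-learn's LocallyLinearEmbedding; the data and parameters are made up for illustration and do not come from the cited papers:

```python
# Minimal sketch: recovering a low-dimensional pose manifold with
# locally linear embedding (LLE). The data below are synthetic
# stand-ins for appearance images; this illustrates the general
# technique, not the citing authors' exact method.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)

# Fake "appearance images": poses on a circle of rotation angles,
# lifted into a 50-dimensional ambient space.
theta = rng.uniform(0.0, 2.0 * np.pi, size=500)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
ambient = circle @ rng.normal(size=(2, 50))

# LLE reconstructs each point from its neighbours and recovers
# intrinsic coordinates, but it has no built-in notion of the
# circular (non-Euclidean) topology, which is the gap the
# citing authors point to.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
coords = lle.fit_transform(ambient)
print(coords.shape)  # (500, 2)
```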
“…• Perceptron nodes based on sigmoidal functions with scalar product or distance-based activations [77,78], as in layers of MLP networks, but with targets specified by some criterion (any criterion used for linear transformations is sufficient).…”
Section: Other Non-linear Mappings
confidence: 99%
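The contrast this statement draws, scalar-product versus distance-based activations, is the central idea of the cited paper. A minimal sketch of the two node types; the exact sigmoid-of-distance form sigma(b - d(x, w)) and the Minkowski distance are illustrative assumptions, not the papers' definitions:

```python
# Two node types: a standard perceptron with a scalar-product
# activation, and a variant whose activation depends on a distance
# to a reference vector. Forms and names are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def inner_product_node(x, w, b):
    # Classic MLP node: sigmoid of a weighted sum.
    # Its decision boundary is a hyperplane.
    return sigmoid(np.dot(w, x) + b)

def distance_node(x, w, b, p=2):
    # Distance-based node: sigmoid of (bias - Minkowski distance
    # to the reference vector w). For p=2 the decision boundary
    # is a hypersphere; other p values give non-Euclidean shapes.
    d = np.sum(np.abs(x - w) ** p) ** (1.0 / p)
    return sigmoid(b - d)

x = np.array([0.5, -1.0])
w = np.array([1.0, 1.0])
print(inner_product_node(x, w, b=0.1))
print(distance_node(x, w, b=2.0, p=2))
print(distance_node(x, w, b=2.0, p=1))  # city-block metric
```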
“…- Weighted distance-based transformations, a special case of general kernel transformations, that use (optimized) reference vectors [43]. - Perceptron nodes based on sigmoidal functions with scalar product or distance-based activations [42,41], as in layers of MLP networks, but with targets specified by some criterion (any criterion used for linear transformations is sufficient). - Heterogeneous transformations using several types of kernels to capture details at different resolution [57].…”
Section: Extracting Information From All Features
confidence: 99%
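A minimal sketch of the weighted distance-based and heterogeneous transformations this statement lists: each new feature is a weighted distance (or a kernel of it) between the input and a reference vector. The reference vectors, weights, and kernel choices here are illustrative assumptions, not the cited papers' definitions:

```python
# Weighted distance-based feature transformation with two kernel
# types over the same reference vectors, hinting at the
# "heterogeneous" case. All values are illustrative.
import numpy as np

def weighted_dist_features(X, refs, scales, p=2):
    # X: (n, d) inputs; refs: (k, d) reference vectors;
    # scales: (d,) per-feature weights (would be optimized).
    diff = np.abs(X[:, None, :] - refs[None, :, :]) * scales
    return np.sum(diff ** p, axis=-1) ** (1.0 / p)  # (n, k)

def heterogeneous_features(X, refs, scales):
    # Two kernels at different "resolutions": a Gaussian of the
    # Euclidean distance (smooth, local) and a raw city-block
    # distance (coarser, unbounded).
    d2 = weighted_dist_features(X, refs, scales, p=2)
    d1 = weighted_dist_features(X, refs, scales, p=1)
    return np.hstack([np.exp(-d2 ** 2), d1])

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))
refs = rng.normal(size=(4, 3))
scales = np.ones(3)
print(heterogeneous_features(X, refs, scales).shape)  # (8, 8)
```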