2020
DOI: 10.1109/tnnls.2020.2975051

A Universal Approximation Result for Difference of Log-Sum-Exp Neural Networks

Abstract: We show that a neural network whose output is obtained as the difference of the outputs of two feedforward networks with exponential activation function in the hidden layer and logarithmic activation function in the output node (LSE networks) is a smooth universal approximator of continuous functions over convex, compact sets. By using a logarithmic transform, this class of networks maps to a family of subtraction-free ratios of generalized posynomials, which we also show to be universal approximators of positi…
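The difference-of-LSE architecture described in the abstract can be sketched roughly as follows; this is a minimal illustration, and the parameter names (`W1`, `b1`, etc.) are illustrative rather than taken from the paper:

```python
import numpy as np

def lse_layer(x, W, b):
    # One LSE network: exponential hidden units followed by a logarithmic
    # output node, i.e. log(sum_k exp(w_k . x + b_k)).
    return np.log(np.sum(np.exp(W @ x + b)))

def dlse_network(x, W1, b1, W2, b2):
    # The paper's class: the difference of two LSE network outputs.
    return lse_layer(x, W1, b1) - lse_layer(x, W2, b2)

# Tiny example: with a single exponential term per network, each LSE
# collapses to an affine function, so the output is their difference.
x = np.array([1.0, 2.0])
W1 = np.array([[1.0, 0.0]]); b1 = np.array([0.0])
W2 = np.array([[0.0, 1.0]]); b2 = np.array([0.0])
print(dlse_network(x, W1, b1, W2, b2))  # 1.0 - 2.0 = -1.0
```

With more hidden units, each `lse_layer` becomes a smooth convex function, so the difference is a smooth difference-of-convex function, which is what underlies the universal approximation result.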

Cited by 36 publications (22 citation statements). References 26 publications.
“…In any adjunction, δ is a dilation and ε is an erosion. The double inequality (19) is equivalent to the inequality (17) satisfied by a residuation pair of increasing operators if we identify the residuated map ψ with δ and its residual ψ with ε. Furthermore, from (19) or (17), it follows that any adjunction (δ, ε) automatically yields an opening α = δε and a closing β = εδ, where the composition of two operators is written as an operator product.…”
Section: Elements of Max-Plus Algebra, Weighted Lattices and Monotone Operators
confidence: 99%
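As a concrete illustration of the opening α = δε and closing β = εδ mentioned in this excerpt, here is a minimal sketch using flat (sliding-window max/min) dilation and erosion on a 1-D signal; the function names and the radius parameter are illustrative, not from the cited work:

```python
import numpy as np

def dilation(f, r):
    # Flat dilation delta: sliding-window maximum of radius r (a max-plus
    # convolution with a flat structuring element); pad edges with -inf.
    g = np.pad(f, r, constant_values=-np.inf)
    return np.array([g[i:i + 2 * r + 1].max() for i in range(len(f))])

def erosion(f, r):
    # Flat erosion epsilon: sliding-window minimum, the adjoint of the
    # dilation above (together they form an adjunction).
    g = np.pad(f, r, constant_values=np.inf)
    return np.array([g[i:i + 2 * r + 1].min() for i in range(len(f))])

rng = np.random.default_rng(0)
f = rng.normal(size=50)
r = 2
opening = dilation(erosion(f, r), r)   # alpha = delta epsilon
closing = erosion(dilation(f, r), r)   # beta  = epsilon delta

# The adjunction automatically makes the opening anti-extensive and the
# closing extensive, i.e. alpha(f) <= f <= beta(f) pointwise.
assert np.all(opening <= f + 1e-12)
assert np.all(f <= closing + 1e-12)
```

Running this on any input signal verifies the anti-extensivity/extensivity consequence of the adjunction, which is the point the excerpt makes algebraically.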
See 3 more Smart Citations
“…The double inequality (19) is equivalent to the inequality (17) satisfied by a residuation pair of increasing operators if we identify the residuated map ψ with δ and its residual ψ with ε. Furthermore, from (19) or (17), it follows that any adjunction (δ, ε) automatically yields an opening α = δε and a closing β = εδ, where the composition of two operators is written as an operator product. Viewing (δ, ε) as an adjunction instead of a residuation pair has the advantage of the additional geometrical intuition and visualization afforded by the dilation and erosion operators in image and shape analysis.…”
Section: Elements of Max-Plus Algebra, Weighted Lattices and Monotone Operators
confidence: 99%
“…As h → 0, the soft versions approach the hard ones: lim_{h→0} x ∨_h y = x ∨ y and lim_{h→0} x ∧_h y = x ∧ y. For small positive values of h, these approximations are part of the Log-Sum-Exp family, used in convex analysis and recently linked to tropical polynomials [25,26]. We use the reciprocal of h, the hardness parameter β = h⁻¹.…”
Section: Background Concepts
confidence: 99%
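The soft maximum and minimum referenced in this excerpt are commonly realized with Log-Sum-Exp. A minimal sketch under that assumption follows; the specific formula x ∨_h y = h·log(e^{x/h} + e^{y/h}) is the standard LSE smoothing from the convex-analysis literature, not quoted from the excerpt itself:

```python
import numpy as np

def soft_max(x, y, h):
    # Soft maximum x v_h y = h * log(exp(x/h) + exp(y/h)).
    # Subtracting the hard max first keeps the exponentials bounded,
    # which avoids overflow for small h (large hardness beta = 1/h).
    m = max(x, y)
    return m + h * np.log(np.exp((x - m) / h) + np.exp((y - m) / h))

def soft_min(x, y, h):
    # Soft minimum via duality: x ^_h y = -((-x) v_h (-y)).
    return -soft_max(-x, -y, h)

# As h shrinks, the soft versions converge to the hard max and min:
for h in (1.0, 0.1, 0.01):
    print(h, soft_max(2.0, 3.0, h), soft_min(2.0, 3.0, h))
```

Note that `soft_max` always upper-bounds the hard maximum (by at most h·log 2 when the arguments are equal), which is why the approximation tightens monotonically as the hardness β = h⁻¹ grows.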