2018
DOI: 10.1007/978-3-030-01246-5_4

Lifting Layers: Analysis and Applications

Abstract: The great advances of learning-based approaches in image processing and computer vision are largely based on deeply nested networks that compose linear transfer functions with suitable non-linearities. Interestingly, the most frequently used nonlinearities in imaging applications (variants of the rectified linear unit) are uncommon in low dimensional approximation problems. In this paper we propose a novel nonlinear transfer function, called lifting, which is motivated from a related technique in convex optimi…
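In convex optimization, lifting typically re-expresses a scalar variable as a vector of interpolation weights over a fixed grid of labels, so that a subsequent linear map realizes a piecewise-linear (spline) response. The following minimal NumPy sketch shows a 1-D lifting of that kind; the equally spaced label grid, the clipping of out-of-range inputs, and the helper name `lift` are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def lift(x, labels):
    # Map each scalar input to barycentric weights over the fixed label grid:
    # for x in [labels[i], labels[i+1]] exactly two entries are non-zero and sum to 1.
    # (Illustrative sketch only.)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    L = len(labels)
    x = np.clip(x, labels[0], labels[-1])                        # assumption: clip out-of-range inputs
    idx = np.clip(np.searchsorted(labels, x, side="right") - 1, 0, L - 2)
    t_lo, t_hi = labels[idx], labels[idx + 1]
    w_hi = (x - t_lo) / (t_hi - t_lo)
    out = np.zeros((x.size, L))
    rows = np.arange(x.size)
    out[rows, idx] = 1.0 - w_hi
    out[rows, idx + 1] = w_hi
    return out

labels = np.linspace(-1.0, 1.0, 5)        # 5 equally spaced labels (assumption)
lifted = lift([-0.3, 0.25, 0.9], labels)  # shape (3, 5); each row sums to 1
theta = np.random.randn(5)                # weights of a subsequent linear layer
spline = lifted @ theta                   # piecewise-linear response in the input
```

Combined with a learned weight vector, such a lifted representation lets even a single linear layer express a linear spline, which hints at why this construction is attractive for low-dimensional approximation problems.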

Cited by 9 publications (7 citation statements)
References 16 publications
“…Nonmonotone activation functions are rarely found in standard CNNs, with some notable exceptions [20,38,76]. Recently, the so-called Swish activation [84] and modifications thereof [72,112] have been found to empirically boost the classification performance of CNNs.…”
Section: Related Work
confidence: 99%
“…The diffusion interpretation suggests that activation functions should be learned in the same manner as convolution weights and biases. In practice, this hardly happens apart from a few notable exceptions such as [3,7,17]. As nonmonotone flux functions outperform monotone ones in the diffusion setting, it appears promising to incorporate them into CNNs.…”
Section: Nonmonotone Activation Functions
confidence: 99%
“…Nonmonotone activation functions are rarely found in standard CNNs, with some notable exceptions [18,35,72]. Recently, the so-called Swish activation [80] and modifications thereof [68,108] have been found to empirically boost the classification performance of CNNs.…”
Section: Related Work
confidence: 99%
“…In practice, this hardly happens apart from a few notable exceptions such as [18,35,72]. Recently, activation functions which are slightly nonmonotone variants of the ReLU proved successful for image classification tasks [68,80,108].…”
Section: ReLU Activation / Charbonnier Activation
confidence: 99%
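For orientation, the Swish activation referenced in the excerpts above is commonly written as swish(x) = x · sigmoid(βx). The short sketch below makes its mild nonmonotonicity visible: values dip slightly below zero for negative inputs before returning to zero at the origin, in contrast to the monotone ReLU. The choice β = 1 and the sample grid are arbitrary illustrative choices.

```python
import numpy as np

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); slightly negative for x < 0, hence mildly nonmonotone.
    return x / (1.0 + np.exp(-beta * x))

x = np.linspace(-4.0, 4.0, 9)
print(np.round(swish(x), 3))   # negative dip (minimum near x ≈ -1.3) before rising through 0
```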