2020
DOI: 10.48550/arxiv.2009.03863
Preprint

TanhSoft -- a family of activation functions combining Tanh and Softplus

Abstract: Deep learning, at its core, consists of functions that compose a linear transformation with a non-linear function known as an activation function. In the past few years, there has been increasing interest in constructing novel activation functions that result in better learning. In this work, we propose a family of novel activation functions, namely TanhSoft, with four undetermined hyper-parameters, of the form tanh(αx + βe^{γx}) ln(δ + e^x), and tune these hyper-parameters to obtain activation functions which ar…
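The family described in the abstract can be sketched directly from its formula. This is a minimal NumPy illustration, not the authors' implementation; the default hyper-parameter values below are placeholders, not the tuned values from the paper.

```python
import numpy as np

def tanhsoft(x, alpha=0.0, beta=0.6, gamma=1.0, delta=1.0):
    """General TanhSoft activation:

        f(x) = tanh(alpha*x + beta*exp(gamma*x)) * ln(delta + exp(x))

    alpha, beta, gamma, delta are the four undetermined hyper-parameters;
    the defaults here are illustrative only.
    """
    x = np.asarray(x, dtype=float)
    return np.tanh(alpha * x + beta * np.exp(gamma * x)) * np.log(delta + np.exp(x))
```

For example, with beta = 0 and delta = 1 the function reduces to tanh(αx)·softplus(x), so it vanishes at x = 0.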

Cited by 1 publication (3 citation statements)
References 8 publications
“…TanhSoft is a family of AAFs proposed in [461] that combines softplus and tanh and contains three notable cases: TanhSoft-1, TanhSoft-2, and TanhSoft-3 [461,462].…”
Section: TanhSoft
confidence: 99%
“…where b_i and c_i are trainable parameters [461,462]. The TanhSoft-2 can be obtained from the general TanhSoft by setting a_i = 0 and d_i = 0 [461].…”
Section: TanhSoft
confidence: 99%
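The reduction quoted above can be checked directly: with a_i = 0 and d_i = 0, the general form tanh(a_i x + b_i e^{c_i x}) ln(d_i + e^x) collapses to x · tanh(b_i e^{c_i x}), since ln(e^x) = x. A minimal NumPy sketch (the parameter names `b` and `c` here stand for the trainable b_i and c_i; the default values are illustrative):

```python
import numpy as np

def tanhsoft2(x, b=1.0, c=1.0):
    # With a_i = 0, the tanh argument reduces to b*exp(c*x); with d_i = 0,
    # the second factor ln(0 + exp(x)) collapses to x exactly.
    x = np.asarray(x, dtype=float)
    return x * np.tanh(b * np.exp(c * x))
```

At x = 1 with b = c = 1 this equals tanh(e), matching the general form evaluated with a_i = 0 and d_i = 0.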