2022
DOI: 10.3390/math10030337

SinLU: Sinu-Sigmoidal Linear Unit

Abstract: Non-linear activation functions are integral parts of deep neural architectures. Given the large and complex dataset of a neural network, its computational complexity and approximation capability can differ significantly based on what activation function is used. Parameterizing an activation function with the introduction of learnable parameters generally improves the performance. Herein, a novel activation function called Sinu-sigmoidal Linear Unit (or SinLU) is proposed. SinLU is formulated as SinLU(x) = (x + a sin(bx)) · σ(x), where σ(x) is the sigmoid function and a, b are trainable parameters. […]
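To make the formulation concrete, here is a minimal PyTorch sketch of SinLU as stated in the abstract, with trainable parameters a (amplitude of the sine term) and b (its frequency). The module and parameter names, and the initialization to 1.0, are illustrative assumptions rather than the paper's reference implementation.

import torch
import torch.nn as nn

class SinLU(nn.Module):
    """Sinu-sigmoidal Linear Unit: SinLU(x) = (x + a*sin(b*x)) * sigmoid(x).

    `a` scales the amplitude of the sinusoidal term and `b` its frequency;
    both are trainable. Initializing both to 1.0 is an assumption here.
    """
    def __init__(self, a_init: float = 1.0, b_init: float = 1.0):
        super().__init__()
        self.a = nn.Parameter(torch.tensor(a_init))
        self.b = nn.Parameter(torch.tensor(b_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x + self.a * torch.sin(self.b * x)) * torch.sigmoid(x)

# Usage: drop-in replacement for ReLU/SiLU in a small MLP
mlp = nn.Sequential(nn.Linear(784, 128), SinLU(), nn.Linear(128, 10))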

Cited by 13 publications (6 citation statements) · References 25 publications
“…In 2022, Ashis Paul, Rajarshi Bandyopadhyay, Jin Yoon, Zong Woo Geem, and Ram Sarkar introduced the Sinu-sigmoidal Linear Unit (SinLU) activation function [15]. The SinLU activation function is designed with two trainable parameters: one controlling the amplitude of the sinusoidal function and the other regulating the frequency of the sine wave.…”
Section: Results · Mentioning (confidence: 99%)
“…The CosLU activation function is an enhancement derived from the Sinu-sigmoidal Linear Unit (SinLU) activation function proposed by Paul et al. [23]. In contrast to the SinLU activation function, which utilizes the sine function, the CosLU activation function employs the cosine function as the periodic function.…”
Section: CosLU · Mentioning (confidence: 99%)
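Reading the quoted description, CosLU appears to keep the SinLU form but replaces the sine with a cosine. The sketch below encodes that reading; the exact parameterization and initialization of CosLU are not given here, so they are assumptions mirroring the SinLU sketch above.

import torch
import torch.nn as nn

class CosLU(nn.Module):
    """Cosine counterpart of SinLU as described in the citation above:
    CosLU(x) = (x + a*cos(b*x)) * sigmoid(x).
    The placement of the trainable parameters mirrors SinLU and is assumed."""
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.ones(1))
        self.b = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x + self.a * torch.cos(self.b * x)) * torch.sigmoid(x)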
“…The Mish activation function, introduced by Misra (2019), provided a smooth, nonlinear alternative that excels in tasks like image classification, albeit with higher computational demands, as noted by Zhang et al. (2021). The recent SinLU activation function by Paul et al. (2022) further expanded the landscape by incorporating two trainable parameters and leveraging the periodicity of the sine function to introduce novel dynamics into neural network training.…”
Section: Related Work · Mentioning (confidence: 99%)
“…The Backbone component is primarily based on the CBS [24], C3 [25], and SPP [26] structures. The CBS structure consists of a 2D convolutional layer (Conv2d) [27], a batch normalization procedure (BatchNorm2d) [28], and an activation function (SiLU) [29], applied in sequence. The C3 structure is split into two branches: the upper branch passes through a standard convolution and the Bottleneck module, while the lower branch passes through a standard convolution, is then concatenated with the upper branch, and finally connects to another standard convolution.…”
Section: YOLOv5 Algorithm · Mentioning (confidence: 99%)
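The CBS block quoted above (Conv2d, then BatchNorm2d, then SiLU) translates directly into a small PyTorch module. The sketch below is illustrative only; the kernel size, stride, and padding defaults are assumptions, not YOLOv5's exact per-layer configuration.

import torch
import torch.nn as nn

class CBS(nn.Module):
    """Conv2d -> BatchNorm2d -> SiLU, the basic block described above.

    Kernel size, stride, and padding defaults are illustrative assumptions;
    YOLOv5 varies them per layer."""
    def __init__(self, c_in: int, c_out: int, k: int = 3, s: int = 1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))

# Usage: one downsampling CBS block on a 3x640x640 input
y = CBS(3, 32, k=3, s=2)(torch.randn(1, 3, 640, 640))  # shape: (1, 32, 320, 320)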