2018
DOI: 10.3390/s18041229
LiteNet: Lightweight Neural Network for Detecting Arrhythmias at Resource-Constrained Mobile Devices

Abstract: By running applications and services closer to the user, edge processing provides many advantages, such as short response time and reduced network traffic. Deep-learning-based algorithms provide significantly better performance than traditional algorithms in many fields but demand more resources, such as higher computational power and more memory. Hence, designing deep learning algorithms that are more suitable for resource-constrained mobile devices is vital. In this paper, we build a lightweight neural netw…

Cited by 42 publications (26 citation statements)
References 30 publications
“…Other research efforts in building network architectures suitable for use on performance restricted environments such as IoT and smartphones have led to another category of models, specifically designed to be computationally efficient. State-of-the-art architectures of this type of models include MobileNet [3], MobileNetV2 [6], ShuffleNet [4], LiteNet [20] and EffNet [5].…”
Section: Related Work (mentioning)
Confidence: 99%
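The efficiency of architectures like MobileNet cited above comes largely from replacing standard convolutions with depthwise-separable ones. As a hedged, back-of-the-envelope sketch (the layer sizes below are illustrative, not taken from any of the cited papers), the parameter savings can be computed directly:

```python
# Parameter count of one 3x3 convolutional layer mapping 128 -> 256 channels.
# Standard conv: every output channel filters all input channels.
# Depthwise-separable: one 3x3 filter per input channel, then a 1x1 pointwise mix.
k, c_in, c_out = 3, 128, 256

standard = k * k * c_in * c_out                    # full k x k x C_in x C_out kernel
depthwise_separable = k * k * c_in + c_in * c_out  # depthwise + pointwise parts

print(standard)                         # 294912
print(depthwise_separable)              # 33920
print(standard / depthwise_separable)   # ~8.7x fewer parameters
```

The roughly 8–9x reduction (approaching k² for wide layers) is why these models fit the resource-constrained devices the citing papers target.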
“…Most of the traditional solutions use sensors for physiological signal detection, for example, Polysomnography [6] and Electrocardiogram [7]. However, these programs are not suitable for the home environment.…”
Section: Introduction (mentioning)
Confidence: 99%
“…The Adam optimizer, presented by Kingma and Ba [35], is extensively used for deep learning models requiring first-order gradient-based descent with small memory and the ability to compute adaptive learning rates for different parameters [36]. This method is computationally efficient, easy to implement, and has proven to perform better than the RMSprop and Rprop optimizers [37]. Gradient rescaling is reliant on the magnitudes of parameter updates.…”
Section: Feed-Forward Neural Network (FNN) (mentioning)
Confidence: 99%
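The Adam update described in the quote above (first-order gradients, small memory footprint, per-parameter adaptive learning rates) can be sketched in a few lines of NumPy. This is an illustrative implementation of the standard Adam rule from Kingma and Ba, not code from the cited papers; `adam_step` and the toy quadratic objective are made up for the example:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and squared gradient (v), with bias correction for the early steps."""
    m = beta1 * m + (1 - beta1) * grad          # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive step
    return theta, m, v

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 5001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)  # close to the minimum at 0
```

Note that only `m` and `v` (each the size of the parameters) persist between steps, which is the small-memory property the citing paper highlights.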