2020
DOI: 10.1007/978-3-030-18338-7_18
Machine Learning at the Edge

Cited by 12 publications (7 citation statements)
References 48 publications
“…The efficiency of the required calculation is such that trained NNs can be converted into compact and efficient TensorFlow Lite models, with a size of approximately 90 kB. These models are suitable for deployment and execution on microcontrollers, which would minimize resource demands and enable data collection and estimation on the same device, leading to a significant reduction in latency [49,50]. We can expect that, during data collection of photon-counting measurements, different chunks of measured time-delays are fed into the trained NN models for inference in real time.…”
Section: Computation Efficiency
confidence: 99%
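The conversion step described in the statement above can be sketched in a few lines. This is a minimal illustration only: the network architecture and the 64-sample input chunk are hypothetical choices, not taken from the cited work, and the resulting flatbuffer will be far smaller than the ~90 kB models mentioned.

```python
import tensorflow as tf

# Tiny illustrative model (hypothetical architecture): maps a chunk of
# 64 photon time-delays to a single parameter estimate.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Convert the trained Keras model into a compact TensorFlow Lite flatbuffer
# suitable for deployment on a microcontroller.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enables weight quantization
tflite_model = converter.convert()

size_kb = len(tflite_model) / 1024
print(f"TFLite model size: {size_kb:.1f} kB")
```

The serialized bytes can then be compiled into firmware and executed with the TensorFlow Lite Micro interpreter, so inference runs on the same device that collects the time-delay data.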
“…Generally, the Bayesian parameter estimation method will be computationally time-consuming, even in simple systems. This hampers the prospects for real-time estimation and for the integration of the inference process in the actual measurement device taking the data, which would be desirable for reduced latency and energy consumption [49,50].…”
Section: Introduction
confidence: 99%
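The computational cost noted above is easy to see even in a toy setting. The sketch below (an assumption-laden illustration, not the cited authors' method) performs grid-based Bayesian estimation of a single rate parameter from simulated exponential photon time-delays; the likelihood must be evaluated over the whole grid for every batch of data, which is what makes real-time, on-device inference attractive by comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated photon time-delays: exponential waiting times with true rate 2.0
true_rate = 2.0
delays = rng.exponential(1.0 / true_rate, size=500)

# Grid-based Bayesian posterior for the rate, with a flat prior on the grid.
rates = np.linspace(0.1, 5.0, 2000)

# Exponential log-likelihood: n * log(r) - r * sum(t)
loglik = len(delays) * np.log(rates) - rates * delays.sum()

# Normalize in a numerically stable way.
post = np.exp(loglik - loglik.max())
post /= post.sum()

estimate = rates[np.argmax(post)]
print(f"Posterior-mode rate estimate: {estimate:.2f}")
```

Even this one-parameter example requires thousands of likelihood evaluations per update; realistic multi-parameter models scale this cost exponentially with the grid dimension, which motivates replacing the inference step with a pre-trained neural network.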
“…This on-device processing capability is also a cornerstone of edge computing, where data are processed locally, enabling real-time data analysis and decision making. NPUs are instrumental in this paradigm, offering the necessary computational power to handle complex AI tasks at the edge, close to where data are generated [71,[93][94][95].…”
Section: Neural Processing Units (NPUs)
confidence: 99%
“…The concept of employing machine learning techniques directly on embedded end devices to perform on-device sensor data analytics, so-called edge machine learning, has recently gained a lot of attention (see [24,25,26] for reviews). It enables the development of intelligent sensor systems with extremely low power consumption that do not need to exchange data with a cloud server (where machine learning algorithms would traditionally be performed).…”
Section: Introduction
confidence: 99%