2022
DOI: 10.2352/ei.2022.34.16.avm-118
Open source deep learning inference libraries for autonomous driving systems

Abstract: Automated driving functions, such as highway driving and parking assist, are increasingly being deployed in high-end cars with the goal of realizing self-driving cars using deep learning (DL) techniques such as convolutional neural networks (CNNs) and Transformers. Traditionally, custom software provided by silicon vendors is used to deploy these DL algorithms on devices. This custom software is optimal for the given hardware but supports limited features, making it inflexible for evaluating various deep learning model …

Cited by 2 publications (1 citation statement)
References 4 publications
“…Open-source DL inference frameworks (Tensorflow Lite, ONNX runtime, TVM, NEO-AI-DLR etc.) [5][6][7][8] have been developed to improve the ease of DL model deployment, with offload mechanisms to DL accelerators [9][10][11]. The memory throughput requirements of DL engines are very high, as these engines are designed to perform a huge number of compute operations per cycle.…”
Section: Deep Learning Application Development Workflow
Confidence: 99%
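The offload mechanism the citing work refers to follows a common pattern across these frameworks (TensorFlow Lite delegates, ONNX Runtime execution providers): each operator in the model graph is dispatched to the accelerator when it is supported there, with a fallback to the reference CPU path otherwise. A minimal sketch of that dispatch logic, using purely hypothetical class and function names rather than any real framework API:

```python
# Illustrative sketch of the delegate/offload pattern used by DL inference
# frameworks. All names here are hypothetical, not a real framework's API.

class AcceleratorDelegate:
    """Pretend DL accelerator that supports only a subset of operators."""
    SUPPORTED = {"conv2d", "relu"}

    def run(self, op, x):
        if op not in self.SUPPORTED:
            raise NotImplementedError(op)
        return [v * 2 for v in x]  # stand-in for the real computation


class CpuBackend:
    """Reference CPU path that supports every operator."""

    def run(self, op, x):
        return [v * 2 for v in x]  # same stand-in computation


def run_graph(ops, x, delegate, cpu):
    """Offload each op to the accelerator when possible, else fall back."""
    placements = []
    for op in ops:
        try:
            x = delegate.run(op, x)
            placements.append((op, "accelerator"))
        except NotImplementedError:
            x = cpu.run(op, x)
            placements.append((op, "cpu"))
    return x, placements


out, placements = run_graph(["conv2d", "relu", "softmax"], [1, 2],
                            AcceleratorDelegate(), CpuBackend())
print(placements)
# [('conv2d', 'accelerator'), ('relu', 'accelerator'), ('softmax', 'cpu')]
```

In the real frameworks this partitioning happens once at model-load time over subgraphs rather than per operator at runtime, but the fallback behavior — unsupported layers silently running on the CPU — is the same, which is why per-layer placement reports matter when benchmarking on DL accelerators.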