2020
DOI: 10.36227/techrxiv.12845753.v1
Preprint

Error Bounds for a Matrix-Vector Product Approximation with Deep ReLU Neural Networks

Abstract: Inspired by the depth and breadth of developments in the theory of deep learning, we pose these fundamental questions: can we accurately approximate an arbitrary matrix-vector product using deep rectified linear unit (ReLU) feedforward neural networks (FNNs)? If so, can we bound the resulting approximation error? Attempting to answer these questions, we derive error bounds in Lebesgue and Sobolev norms for a matrix-vector product approximation with deep ReLU FNNs. Since a matrix-vector product models …
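
The question the abstract poses can be made concrete with a small numerical experiment: fit a deep ReLU FNN to the map x ↦ Ax for one fixed matrix A and measure the empirical approximation error on held-out inputs. The sketch below is not from the preprint; the architecture, widths, sampling domain, and training setup are illustrative assumptions, and the reported quantity is only an empirical stand-in for the Lebesgue-norm error bounds the paper derives.

```python
# Minimal sketch (illustrative assumptions): approximate x -> Ax with a
# deep ReLU feedforward network and report an empirical L2 error.
import torch
import torch.nn as nn

torch.manual_seed(0)

m, n = 8, 16                      # output and input dimensions (assumed)
A = torch.randn(m, n)             # fixed target matrix

# Deep ReLU FNN: R^n -> hidden ReLU layers -> R^m
model = nn.Sequential(
    nn.Linear(n, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, m),
)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train on inputs drawn uniformly from the cube [-1, 1]^n
for step in range(2000):
    x = 2 * torch.rand(256, n) - 1
    y = x @ A.T                   # exact matrix-vector products (batched)
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# Empirical mean L2 approximation error on held-out samples
with torch.no_grad():
    x_test = 2 * torch.rand(4096, n) - 1
    err = (model(x_test) - x_test @ A.T).norm(dim=1).mean()
print(f"mean ||f(x) - Ax||_2 on test samples: {err.item():.4f}")
```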

Cited by 0 publications
References 84 publications (193 reference statements)