2021
DOI: 10.3390/mi12020214
Hybrid Deep Recurrent Neural Networks for Noise Reduction of MEMS-IMU with Static and Dynamic Conditions

Abstract: The micro-electro-mechanical system inertial measurement unit (MEMS-IMU), a core component of many navigation systems, directly determines the accuracy of an inertial navigation system. However, a MEMS-IMU is often affected by factors such as environmental noise, electronic noise, mechanical noise, and manufacturing error, which can seriously limit the application of MEMS-IMUs in different fields. The focus has been on the MEMS gyro, since it is an essential and yet complex sensor in the MEMS-IMU which is very …

Cited by 57 publications (28 citation statements)
References 55 publications
“…For real-time online signal processing, the results of the proposed scheme are compared with small-scale networks; an MLP network with the leaky ReLU activation function, which requires only simple calculations, is sufficient to achieve the corresponding error-suppression effect. From Table 2, when small-scale recurrent neural networks are used [23,24], the error-suppression effect obtained is similar to that of the MLP network used in this article, while with large-scale networks [22,26] the error-suppression effect obtained is better than that of the proposed method. This comparison shows that the compensation effect of the adopted scheme at the algorithm level is effective and reasonable.…”
Section: Implementation Details and Experimental Results (supporting)
confidence: 59%
“…The computational complexity of a neural-network-based error compensation scheme is large, which is a challenge for real-time online applications. Compared with related works [22][23][24][25][26], the network model and activation function used in this paper are simpler and less complex. When the input layer has N neurons and the hidden layer has H neurons, the shape of the input vector is (1, N), the shape of the hidden-layer parameter matrix is (N, H), and the shape of the hidden-layer feature matrix is (1, H).…”
Section: Circuit-Level Realization and Analysis of the Error Compensation Scheme (mentioning)
confidence: 99%
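The shape arithmetic in the statement above can be sketched in a few lines of NumPy. This is a minimal illustration, not the cited paper's implementation: the concrete sizes N = 6 and H = 32, the leaky ReLU slope, and the random weights are all assumed for demonstration.

```python
import numpy as np

# Assumed sizes for illustration: N input neurons, H hidden neurons.
N, H = 6, 32

rng = np.random.default_rng(0)
x = rng.standard_normal((1, N))   # input vector, shape (1, N)
W = rng.standard_normal((N, H))  # hidden-layer parameter matrix, shape (N, H)

def leaky_relu(z, alpha=0.01):
    """Leaky ReLU: positives pass through, negatives are scaled by alpha."""
    return np.where(z > 0, z, alpha * z)

# (1, N) @ (N, H) -> (1, H): the hidden-layer feature matrix.
h = leaky_relu(x @ W)
print(h.shape)  # (1, 32)
```

The matrix product makes the complexity claim concrete: one hidden layer costs on the order of N·H multiply-accumulates per sample, which is why a single small MLP layer with a cheap activation suits real-time online compensation better than a large recurrent network.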