SAE Technical Paper Series 1998
DOI: 10.4271/980790
SI Engine Modeling Using Neural Networks

Cited by 40 publications (12 citation statements). References 13 publications.
“…A well-known architecture is the multilayer perceptron (MLP). The equations describing the network are given by (1), where y is the network output, intended to reproduce the process output, and x is the input vector containing all the process variables that influence y.…”
Section: A Diagonal Recurrent Neural Network (mentioning)
confidence: 99%
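The quoted passage refers to equation (1) of the citing paper without reproducing it. As an illustration only, and not necessarily the exact form used by those authors, a single-hidden-layer MLP mapping the process inputs x to the modelled output y is commonly written as

y = W_2 \, \sigma\!\left( W_1 x + b_1 \right) + b_2

where W_1 and W_2 are weight matrices, b_1 and b_2 are bias vectors, and \sigma is a sigmoidal activation applied element-wise.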
“…The following variables have been used as inputs for the engine torque estimation based on the diagonal recurrent neural network architecture presented above: engine speed, manifold pressure, valve actuation, EGR (Exhaust Gas Recirculation) actuation, ignition timing and fuel injection [1], [2].…”
Section: Engine Torque Model (mentioning)
confidence: 99%
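As a further illustration, the following minimal sketch (plain NumPy, not the cited authors' implementation; the class and variable names are hypothetical) shows how a diagonal recurrent neural network, i.e. a recurrent layer whose feedback matrix is restricted to its diagonal so each hidden unit feeds back only to itself, could map the six listed inputs to a torque estimate.

import numpy as np

class DiagonalRNN:
    """Diagonal recurrent network: each hidden unit feeds back only to itself."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input weights
        self.w_rec = rng.normal(scale=0.1, size=n_hidden)          # diagonal recurrent weights
        self.b = np.zeros(n_hidden)                                # hidden biases
        self.w_out = rng.normal(scale=0.1, size=n_hidden)          # linear output weights
        self.h = np.zeros(n_hidden)                                # hidden state

    def step(self, x):
        # h_k = tanh(W_in x_k + w_rec * h_{k-1} + b);  torque_k = w_out . h_k
        self.h = np.tanh(self.W_in @ x + self.w_rec * self.h + self.b)
        return float(self.w_out @ self.h)

# Hypothetical input ordering, following the variables listed in the excerpt:
# [engine speed, manifold pressure, valve actuation, EGR actuation,
#  ignition timing, fuel injection]
model = DiagonalRNN(n_in=6, n_hidden=8)
x_k = np.array([2500.0, 0.85, 0.4, 0.1, 12.0, 9.5])   # made-up operating point
torque_estimate = model.step(x_k)                      # untrained, so not meaningful

Restricting the recurrent weights to a diagonal keeps the parameter count and training cost close to those of a static MLP while still giving the model internal dynamics.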
“…Brahma et al. [22] used ANN models to optimize control variables of the diesel engine, although with some challenges regarding noise in the optimization results. Applications in On-Board Diagnostics (OBD) [23,24] and Hardware-in-the-Loop (HIL) simulation [25-27] have exploited the computational efficiency of ANNs. The use of ANNs for estimating the mass air flow rate through a VVT engine has been demonstrated by Wu et al. in [28].…”
Section: Introduction (mentioning)
confidence: 99%