2020
DOI: 10.1177/1468087420974148

Deep feature learning of in-cylinder flow fields to analyze cycle-to-cycle variations in an SI engine

Abstract: Machine learning (ML) models based on a large data set of in-cylinder flow fields of an IC engine, obtained by high-speed particle image velocimetry, allow the identification of relevant flow structures underlying cycle-to-cycle variations of engine performance. To this end, deep feature learning is employed to train ML models that predict cycles of high and low in-cylinder maximum pressure. Deep convolutional autoencoders are trained in a self-supervised manner to encode flow-field features in a low-dimensional latent space.…
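The pipeline the abstract describes lends itself to a compact illustration. Below is a minimal sketch of a deep convolutional autoencoder of that kind: it compresses a two-component PIV velocity field into a low-dimensional latent vector and reconstructs it, with the input itself as the training target (self-supervised, no labels). The grid resolution (64×64), latent size, and layer widths are hypothetical, not the authors' architecture; PyTorch is assumed.

```python
# Minimal convolutional autoencoder sketch (PyTorch). The input shape
# (2 velocity components on a 64x64 PIV grid) and the latent size are
# hypothetical; this is not the paper's actual architecture.
import torch
import torch.nn as nn

class FlowAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 32):
        super().__init__()
        # Encoder: compress the 2-channel velocity field to a latent vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim),
        )
        # Decoder: reconstruct the velocity field from the latent vector.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 2, 3, stride=2, padding=1, output_padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# Self-supervised training: the reconstruction target is the input itself.
model = FlowAutoencoder()
fields = torch.randn(8, 2, 64, 64)   # stand-in for a batch of PIV snapshots
recon, latent = model(fields)
loss = nn.functional.mse_loss(recon, fields)
loss.backward()
```

A classifier for high- versus low-maximum-pressure cycles would then operate on the latent vector `z` rather than on the raw velocity field, which is the point of the dimensionality reduction.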

Cited by 18 publications (11 citation statements); references 57 publications.
“…Using unlabeled data and pre-trained models: Labeling data could be very expensive and might limit the available data set size. Unlabeled data might be exploited in the training process, e.g., by performing unsupervised pre-training [107] and semi-supervised learning algorithms [108][109][110]. Complementary, transfer learning could be used to pre-train the network on a proxy data set (e.g., from simulations) that resembles the original data to extract common features [111].…”
Section: Modeling (mentioning)
confidence: 99%
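As a rough illustration of the semi-supervised idea in this statement, the sketch below reuses the hypothetical FlowAutoencoder from the earlier sketch as a frozen feature extractor and fine-tunes only a small classification head on the few labeled cycles. The labels (high vs. low maximum pressure), tensor sizes, and the choice to freeze the encoder are illustrative assumptions.

```python
# Sketch: reuse the self-supervised encoder as a frozen feature extractor
# and train only a small head on the scarce labeled cycles.
# Assumes the FlowAutoencoder class from the earlier sketch.
import torch
import torch.nn as nn

pretrained = FlowAutoencoder()     # in practice, load pre-trained weights here
for p in pretrained.encoder.parameters():
    p.requires_grad = False        # keep the pre-trained features fixed

head = nn.Linear(32, 2)            # latent_dim -> {low, high} p_max class
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

labeled_fields = torch.randn(4, 2, 64, 64)   # small labeled subset
labels = torch.tensor([0, 1, 1, 0])          # 0 = low, 1 = high max pressure

with torch.no_grad():
    z = pretrained.encoder(labeled_fields)   # frozen feature extraction
logits = head(z)
loss = nn.functional.cross_entropy(logits, labels)
loss.backward()
optimizer.step()
```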
“…Unlabeled data might be exploited in the training process, e.g., by performing unsupervised pre-training [87,88] and semi-supervised learning algorithms [89,90]. Complementary, transfer learning could be used to pre-train the network on a proxy data set (e.g.…”
Section: Model Complexity (mentioning)
confidence: 99%
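The transfer-learning variant mentioned here, pre-training on a proxy data set such as simulations and then adapting to the experimental data, could look roughly like the following sketch. It again assumes the hypothetical FlowAutoencoder from the earlier sketch; the data tensors and learning rates are placeholders.

```python
# Sketch of transfer learning from a proxy data set: pre-train on
# (cheap, abundant) simulated fields, then fine-tune on the (scarce)
# experimental PIV fields at a reduced learning rate.
# Assumes the FlowAutoencoder class from the earlier sketch.
import torch
import torch.nn as nn

model = FlowAutoencoder()

# Stage 1: pre-training on simulated flow fields (placeholder data).
sim_fields = torch.randn(16, 2, 64, 64)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
opt.zero_grad()
recon, _ = model(sim_fields)
nn.functional.mse_loss(recon, sim_fields).backward()
opt.step()

# Stage 2: fine-tuning on experimental fields; the smaller learning
# rate is meant to preserve the common features learned in stage 1.
exp_fields = torch.randn(4, 2, 64, 64)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
opt.zero_grad()
recon, _ = model(exp_fields)
nn.functional.mse_loss(recon, exp_fields).backward()
opt.step()
```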
“…With the dawn of significant improvements in advanced imaging, laser, and computing technologies over the last decade, engine CCV has been studied with detailed spatial and temporal scales. Recent experimental studies have related the in-cylinder velocity to fired engine CCV (Buschbeck et al. 2012; Zeng et al. 2015, 2019; Bode et al. 2017, 2019; Schiffmann et al. 2017; Fach et al. 2022; Dreher et al. 2021; Hanuschkin et al. 2020) using high-speed particle image velocimetry (PIV) combined with other techniques such as in-cylinder pressure measurements as well as flame and spark imaging. In particular, Buschbeck et al. and Zeng et al. studied homogeneous mixtures with varying equivalence ratios to examine the influence of the flame speed on cyclic performance; it was shown that large-scale flow structures can have a significant effect on the flame development and subsequent combustion speed, a finding which is amplified when the flame speed is slower (Buschbeck et al. 2012; Zeng et al. 2019).…”
Section: Introduction (mentioning)
confidence: 99%