DOI: 10.5821/dissertation-2117-114007

Improving heterogeneous system efficiency : architecture, scheduling, and machine learning

Daniel A. Nemirovsky

Abstract: Computer architects are beginning to embrace heterogeneous systems as an effective method to utilize increases in transistor densities for executing a diverse range of workloads under varying performance and energy constraints. As heterogeneous systems become more ubiquitous, architects will need to develop novel CPU scheduling techniques capable of exploiting the diversity of computational resources. In recognizing hardware diversity, state-of-the-art heterogeneous schedulers are able to produce significant p…

Cited by 1 publication (3 citation statements)
References 83 publications (135 reference statements)
“…This pre-processing stage is often necessary for improving the precision of prediction across specific ranges of data sets and making the learning process more reliable. The input samples are divided into three major sets: training data is used to fit the target design; validation data is used to tune the model's hyperparameters and guard against overfitting; testing samples are used to validate the final trained model [19]. In the case of noisy data, a soft-computing ANN technique can provide a more effective optimization strategy.…”
Section: Learning-based Supervised Prediction (mentioning)
confidence: 99%
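The three-way split described in this citation statement can be made concrete with a short sketch. This is a minimal illustration assuming scikit-learn and synthetic data; the 60/20/20 proportions, array shapes, and random seeds are assumptions for illustration only, not values taken from the cited work.

# Minimal sketch of the training/validation/test split described above.
# Proportions and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))   # synthetic feature matrix
y = rng.normal(size=1000)        # synthetic regression targets

# Hold out 20% as the untouched test set used only to validate the final model.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Split the remainder into training data (fits the model) and validation data
# (tunes hyperparameters and flags overfitting); 0.25 of the remaining 80% = 20%.
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=0)

print(len(X_train), len(X_val), len(X_test))  # 600 200 200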
“…Some of these enhancement methods have been extended to ML-based approaches for memory structures, as highlighted in [19, 37-40]. To model task performance in terms of IPC and cache accesses on heterogeneous chip multicores, Nemirovsky [19] used ANN learning models.…”
Section: Power Management for the Memory Systems (mentioning)
confidence: 99%
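For context on the ANN-based performance prediction that this citing paper attributes to [19], the following is a minimal sketch of that style of model: a small multilayer perceptron that maps per-thread counter features to predicted IPC and cache accesses on a target core type. The feature choices, network size, and synthetic data are illustrative assumptions, not the model from the thesis.

# Sketch of an ANN performance predictor in the spirit of the approach
# attributed to [19]; features, targets, and architecture are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Hypothetical per-thread features, e.g. recent IPC, L1/L2 miss rates, branch mispredictions.
X = rng.uniform(size=(2000, 4))
# Two synthetic targets per sample: an IPC proxy and a cache-access proxy on the other core type.
y = np.column_stack([
    0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=2000),
    3.0 * X[:, 2] + 1.5 * X[:, 3] + 0.1 * rng.normal(size=2000),
])

# Small multilayer perceptron with standardized inputs, trained as a
# multi-output regressor (one output per predicted metric).
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=1),
)
model.fit(X[:1500], y[:1500])           # train on the first 1500 samples
print(model.score(X[1500:], y[1500:]))  # R^2 on the held-out remainder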