2022 International Conference on ICT for Sustainability (ICT4S)
DOI: 10.1109/ict4s55073.2022.00015

Data-Centric Green AI: An Exploratory Empirical Study

Abstract: With the growing availability of large-scale datasets, and the popularization of affordable storage and computational capabilities, the energy consumed by AI is becoming a growing concern. To address this issue, in recent years, studies have focused on demonstrating how AI energy efficiency can be improved by tuning the model training strategy. Nevertheless, how modifications applied to datasets can impact the energy consumption of AI is still an open question. To fill this gap, in this exploratory study, we ev…

Cited by 40 publications (25 citation statements)
References: 31 publications

Citation statements (ordered by relevance):
“…Albeit the training phase is intuitively the most energy-greedy phase, this result calls for a word of caution. According to recent results (e.g., a study on data-centric Green AI [98]), the inference phase consumes only a negligible fraction of the energy consumed in the training phase. Nevertheless, given the high execution rate of the inference phase, how the energy consumed by the infrequent execution of the training phase compares to that of the highly executed inference phase is still an open question.…”
Section: Discussion (mentioning)
Confidence: 99%
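The open question raised above is essentially a break-even comparison between a one-off training cost and a recurring inference cost. The sketch below is a minimal back-of-the-envelope illustration with made-up energy figures; neither the function name nor the numbers come from the cited study.

```python
# Hypothetical break-even sketch: if training consumes e_train joules once
# and each inference consumes e_infer joules, after how many inferences
# does the cumulative inference energy match the one-off training cost?

def break_even_inferences(e_train_joules: float, e_infer_joules: float) -> float:
    """Return N such that N * e_infer_joules == e_train_joules."""
    return e_train_joules / e_infer_joules

# Illustrative, made-up numbers: training at 3.6 MJ (1 kWh),
# a single inference at 0.36 J.
print(break_even_inferences(3.6e6, 0.36))  # -> 10,000,000 inferences
```

Under these assumed figures, inference energy would overtake training energy only after ten million requests; with other figures the balance can flip, which is precisely why the comparison remains open.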
“…Data-centric approaches for Green AI show that feature selection and subsampling techniques can significantly reduce the energy consumption of training machine learning models [98]. Subsampling strategies can be made more sophisticated by removing data points that are expected to be redundant in terms of knowledge acquisition [35].…”
Section: Green AI Topics (mentioning)
Confidence: 99%
“…Data-centric approaches for Green AI show that feature selection and subsampling techniques can significantly reduce the energy consumption of training machine learning models (Verdecchia et al, 2022b). Subsampling strategies can be made more sophisticated by removing data points that are expected to be redundant in terms of knowledge acquisition (Dhabe et al, 2021).…”
Section: Green AI Topics (mentioning)
Confidence: 99%
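As a concrete illustration of the data-centric techniques named in the two statements above, the sketch below trains on a random subsample of the data points with a reduced feature set. It is a minimal scikit-learn example; the dataset, sampling ratio, number of features, and model are arbitrary assumptions for illustration, not the configuration evaluated in the cited study.

```python
# Minimal sketch of data-centric energy reduction: train on a random
# subsample of the data points and a reduced set of features.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1) Subsampling: keep only 30% of the training points (arbitrary ratio).
rng = np.random.default_rng(0)
idx = rng.choice(len(X_train), size=int(0.3 * len(X_train)), replace=False)
X_small, y_small = X_train[idx], y_train[idx]

# 2) Feature selection: keep the 20 most informative features (arbitrary k).
selector = SelectKBest(f_classif, k=20).fit(X_small, y_small)
X_small_sel = selector.transform(X_small)

# Training on the reduced dataset processes far fewer values per epoch,
# which is the lever the data-centric Green AI study investigates.
model = LogisticRegression(max_iter=1000).fit(X_small_sel, y_small)
print("accuracy:", model.score(selector.transform(X_test), y_test))
```

The trade-off is the one the cited study explores empirically: smaller inputs cut training energy, at the cost of a possible drop in model accuracy.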
“…Out of all Green AI strategies that report concrete saving percentages, a technique based on structure simplification for deep neural networks saves the most energy, amounting to 115% energy savings (Zhang et al, 2018a). The other techniques that optimize energy the most are based on quantizing the inputs of decision trees (Abreu et al, 2020) (97% energy savings), using data-centric Green AI techniques (Verdecchia et al, 2022b) (92% energy savings), and leveraging efficient deployment of AI algorithms via virtualized cloud fog networks (Yosuf et al, 2021) (91% energy savings). Overall, more…”
Section: Energy Savings (mentioning)
Confidence: 99%