Deep Neural Networks (DNNs) have achieved great success in many applications, such as image classification, natural language processing, and speech recognition. The architectures of DNNs have proven to play a crucial role in their performance. However, designing architectures for different tasks is a difficult and time-consuming process of trial and error. Neural Architecture Search (NAS), which has received great attention in recent years, can design architectures automatically. Among the different kinds of NAS methods, Evolutionary Computation (EC) based methods have recently gained much attention and success. Unfortunately, there is no comprehensive survey of EC-based NAS methods. This paper reviews over 100 papers on EC-based NAS methods in light of their common process. Four steps of this process are covered: population initialization, population operators, evaluation, and selection. Furthermore, current challenges and issues are discussed to identify future research directions in this field.
“…The existing methods usually employ random search [6], grid search [7], reinforcement learning [3], Bayesian optimization [8], evolutionary algorithms [9] and gradient-based methods [10] to explore the space of neural architectures. Although they give rise to a large number of studies for reporting more accurate classifiers, researchers are still faced with the challenge of computationally expensive simulations.…”
Neural architecture search (NAS) has achieved great success in computer vision tasks such as object detection and image recognition. However, deep learning models have millions or billions of parameters, and applying NAS methods to a small amount of data is not trivial. Unlike computer vision tasks, labeling time series data for supervised learning is a laborious and expensive task that often requires expertise. Therefore, this paper proposes a simple-yet-effective fine-tuning method based on repeated k-fold cross-validation for training deep residual networks using only a small amount of time series data. The main idea is that each model fitted during cross-validation transfers its weights to the subsequent folds over the rounds. We conducted extensive experiments on 85 instances from the UCR archive for Time Series Classification (TSC) to investigate the performance of the proposed approach. The experimental results reveal that our proposed model, called NAS-T, reaches new state-of-the-art TSC accuracy with a single classifier that is able to beat HIVE-COTE, an ensemble of 37 individual classifiers.
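The fine-tuning scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the deep residual network is stood in for by a toy single-parameter model, and the function names (`k_folds`, `fit`, `repeated_kfold_transfer`) are hypothetical. The key idea it demonstrates is that each fold's model inherits the weights of the previous fold across all rounds, instead of being re-initialized from scratch.

```python
def k_folds(n, k):
    """Yield (train_indices, val_indices) splits for plain k-fold CV."""
    idx = list(range(n))
    size = n // k
    for i in range(k):
        val = idx[i * size:(i + 1) * size] if i < k - 1 else idx[i * size:]
        val_set = set(val)
        train = [j for j in idx if j not in val_set]
        yield train, val

def fit(weight, data, lr=0.5):
    """Toy 'training' step: nudge a single weight toward the fold's data
    mean. Stands in for fitting a deep residual network on the fold."""
    target = sum(data) / len(data)
    return weight + lr * (target - weight)

def repeated_kfold_transfer(data, k=5, rounds=3):
    """Repeated k-fold fine-tuning with weight transfer: the weights
    fitted on one fold seed training on the next fold, over all rounds."""
    w = 0.0  # initial weights (trained from scratch only once)
    for _ in range(rounds):
        for train_idx, _val_idx in k_folds(len(data), k):
            w = fit(w, [data[j] for j in train_idx])  # warm start from w
    return w
```

Because the weights are carried forward, the model sees k * rounds training passes that each refine the previous state, which is the mechanism the paper relies on to make the most of a small dataset.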