2020
DOI: 10.1016/j.softx.2020.100548
Mcfly: Automated deep learning on time series

Cited by 36 publications (27 citation statements); references 16 publications.
“…Below we provide a description of four architectures used for time series classification: Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Residual Networks (ResNet) and Inception Time Networks (InceptionTime). These architectures were chosen because they are widely adopted for time series classification and because they are available in mcfly (the software we use here for model implementation; van Kuppevelt et al, 2020).…”
Section: Methods (mentioning)
confidence: 99%
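The four architecture families named in this excerpt correspond to the model types mcfly can propose during model generation. As a rough, hedged sketch (assuming mcfly 3.x and its modelgen.generate_models interface with x_shape, number_of_classes, number_of_models and model_types arguments; the data shapes below are invented for illustration), requesting candidates of each type might look like:

```python
import numpy as np
from mcfly import modelgen

# Invented toy data: 120 series, 100 time steps, 3 channels, 4 classes.
X_train = np.random.rand(120, 100, 3)
num_classes = 4

# Ask mcfly to propose candidate models drawn from the four architecture
# families quoted above (CNN, DeepConvLSTM, ResNet, InceptionTime).
models = modelgen.generate_models(
    X_train.shape,
    number_of_classes=num_classes,
    number_of_models=8,
    model_types=['CNN', 'DeepConvLSTM', 'ResNet', 'InceptionTime'],
)

# Each entry is assumed to be (compiled Keras model, sampled hyperparameters, model type).
for model, hyperparameters, model_type in models:
    print(model_type, hyperparameters)
```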
“…Deep learning models can be implemented using several programming languages and specialised libraries (see Christin et al, 2019 for a review). Here, we use mcfly, a Python package for time series classification using deep learning (van Kuppevelt et al, 2020). This package is aimed at non-experts and it should be easy to use for ‘mid-level’ ecological modellers.…”
Section: Methods (mentioning)
confidence: 99%
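To illustrate the "easy to use for non-experts" point made in this excerpt, a minimal session could be a single call to mcfly's architecture search. The sketch below assumes the find_architecture.find_best_architecture function and its four return values as documented for mcfly 3.x; the toy data shapes, model counts, and epoch numbers are placeholders, not values from the cited study:

```python
import numpy as np
from tensorflow.keras.utils import to_categorical
from mcfly import find_architecture

# Invented toy dataset: 200 training and 60 validation series,
# 64 time steps, 1 channel, 3 classes (one-hot encoded labels).
X_train = np.random.rand(200, 64, 1)
y_train = to_categorical(np.random.randint(0, 3, size=200), num_classes=3)
X_val = np.random.rand(60, 64, 1)
y_val = to_categorical(np.random.randint(0, 3, size=60), num_classes=3)

# One call generates a handful of random architectures, trains each briefly,
# and returns the best-performing model plus a k-NN baseline accuracy.
best_model, best_params, best_model_type, knn_acc = \
    find_architecture.find_best_architecture(
        X_train, y_train, X_val, y_val,
        number_of_models=4, nr_epochs=2, subset_size=100, verbose=False,
    )
print(best_model_type, knn_acc)
```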
“…A random search over the hyperparameters, using the Mcfly software package [33], was performed to generate 50 convolutional neural network (CNN) [34] models and 50 deep convolutional long short-term memory (DeepConvLSTM) [35] models. The randomly selected hyperparameters were the regularization rate, learning rate, number of convolutional layers, number of filters per layer, number of hidden nodes, and number of long short-term memory (LSTM) layers (only for DeepConvLSTM) [33]. Seventy percent of the training/validation dataset was randomly selected to train the 100 models, and the remaining 30% was used to evaluate them.…”
Section: Deep Learning Algorithm Training and Selection (mentioning)
confidence: 99%
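The random-search workflow described in this excerpt (50 CNN candidates, 50 DeepConvLSTM candidates, a 70/30 split, then evaluation on the held-out 30%) maps onto mcfly's generate_models and train_models_on_samples functions. The sketch below is an illustration under those assumptions; the placeholder data, the keyword names (number_of_models, model_types, nr_epochs, subset_size), and the three return values of train_models_on_samples are taken from mcfly's documented interface as we understand it and should be checked against the installed version:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical
from mcfly import modelgen, find_architecture

# Placeholder data standing in for the study's time series (shapes invented).
X = np.random.rand(500, 128, 3)
y = to_categorical(np.random.randint(0, 2, size=500), num_classes=2)

# 70/30 random split of the training/validation data, as in the excerpt.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Random search: 50 CNN candidates plus 50 DeepConvLSTM candidates, with
# hyperparameters (learning rate, regularization rate, number of layers,
# filters, hidden nodes, LSTM layers) sampled by mcfly itself.
models = (
    modelgen.generate_models(X_train.shape, number_of_classes=2,
                             number_of_models=50, model_types=['CNN'])
    + modelgen.generate_models(X_train.shape, number_of_classes=2,
                               number_of_models=50, model_types=['DeepConvLSTM'])
)

# Train each candidate briefly on the 70% split and score it on the 30% split.
histories, val_metrics, val_losses = find_architecture.train_models_on_samples(
    X_train, y_train, X_val, y_val, models,
    nr_epochs=2, subset_size=300, verbose=False,
)
best = int(np.argmax(val_metrics))
print('Best candidate:', models[best][2], 'validation score:', val_metrics[best])
```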