2021
DOI: 10.3837/tiis.2021.08.016
Self-Supervised Long-Short Term Memory Network for Solving Complex Job Shop Scheduling Problem

Abstract: …and was selected as an outstanding paper. This version includes a concrete explanation of SS-LSTM and an analysis of related work.

Cited by 4 publications (3 citation statements)
References 25 publications
“…where W_f and b_f are the neural network parameters of the forget gate f_t = σ(W_f·[h_{t−1}, x_t] + b_f), whose output lies between 0 and 1, with 0 indicating complete abandonment and 1 indicating complete retention; multiplying f_t with the cell state C_{t−1} then determines which information the cell state is to abandon (i.e. forget) [8].…”
Section: Long Short-Term Memory Network
confidence: 99%
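The forget-gate computation quoted above can be sketched as follows. This is a minimal NumPy illustration of the standard LSTM forget gate, not code from the cited paper; the dimensions and weights are made up for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forget_gate(W_f, b_f, h_prev, x_t, c_prev):
    """Standard LSTM forget gate: f_t = sigmoid(W_f @ [h_prev; x_t] + b_f).

    Each entry of f_t lies in (0, 1): 0 means the corresponding component
    of the previous cell state is completely abandoned, 1 means it is
    fully retained. Multiplying element-wise with c_prev applies the gate.
    """
    concat = np.concatenate([h_prev, x_t])   # [h_{t-1}; x_t]
    f_t = sigmoid(W_f @ concat + b_f)
    return f_t * c_prev                      # gated cell state

# toy sizes: hidden size 2, input size 3 (so W_f is 2 x 5)
rng = np.random.default_rng(0)
W_f = rng.standard_normal((2, 5))
b_f = np.zeros(2)
gated = forget_gate(W_f, b_f, np.ones(2), np.ones(3), np.ones(2))
```

Because the sigmoid output is strictly between 0 and 1, each component of the previous cell state is partially kept rather than hard-switched on or off.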
“…PPO is a DRL algorithm [8] for training an agent to learn and perform tasks in sophisticated environments. The central idea of the algorithm is to train agents with PPO so that the gap between the old and new policies stays small [9].…”
Section: Proximal Policy Optimization
confidence: 99%
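The "keep the gap between old and new policies small" idea is usually realized through PPO's clipped surrogate objective. A minimal sketch, assuming the standard clipped form with clip parameter eps = 0.2 (this is illustrative, not the cited paper's implementation):

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO clipped surrogate objective (to be maximized).

    ratio = pi_new(a|s) / pi_old(a|s). Clipping the ratio to
    [1 - eps, 1 + eps] and taking the minimum removes any incentive
    to move the new policy far from the old one in a single update.
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    return np.minimum(unclipped, clipped)

# with advantage > 0, a ratio far above 1 + eps earns no extra objective:
capped = ppo_clip_objective(np.array([2.0]), np.array([1.0]))
```

Taking the element-wise minimum makes the objective a pessimistic bound: large policy shifts are rewarded at most as if the ratio were 1 ± eps.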
“…Tian et al. [24] use the long short-term memory network (LSTM), which can retain features across time steps, to predict production information, and use an improved GA to produce the schedule. Shao et al. [25] also build an LSTM-based model, the self-supervised long-short term memory (SS-LSTM), which handles dynamic events effectively, but its hyperparameters are difficult to determine. All the above-mentioned deep learning models either need one module to preprocess the input of the neural network or use other modules to process its output to obtain the final scheduling plan, which may compound the errors between modules and lead to a suboptimal scheduling plan.…”
Section: Introduction
confidence: 99%