2021
DOI: 10.1002/int.22620
A long short‐term memory‐based model for greenhouse climate prediction

Abstract: Greenhouses can grow many off‐season vegetables and fruits, which improves people's quality of life. Greenhouses can also help crops resist natural disasters and ensure stable crop growth. However, it is highly challenging to carefully control the greenhouse climate. A greenhouse climate prediction model therefore offers a way to address this challenge. We focus on the six climatic factors that affect crop growth, including temperature, humidity, illumination, carbon dioxide concentra…
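The abstract describes predicting six climatic factors with an LSTM, which captures temporal dependence in historical readings through gated cell-state updates. A minimal NumPy sketch of a single LSTM cell's forward pass over a window of six-factor readings is shown below; this is an illustration of the general mechanism, not the authors' implementation, and all sizes and weights here are arbitrary assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x_seq, W, U, b, hidden_size):
    """Run one LSTM layer over x_seq of shape (T, input_size).

    W: (4*hidden, input) input weights; U: (4*hidden, hidden) recurrent
    weights; b: (4*hidden,) biases, stacked in gate order i, f, o, g.
    """
    h = np.zeros(hidden_size)  # hidden state (short-term summary)
    c = np.zeros(hidden_size)  # cell state (long-term memory)
    for x_t in x_seq:
        z = W @ x_t + U @ h + b                       # all gate pre-activations
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
        g = np.tanh(g)                                # candidate cell update
        c = f * c + i * g                             # gated memory update
        h = o * np.tanh(c)                            # new hidden state
    return h, c

# Hypothetical window: 24 hourly readings of the paper's six climate factors.
rng = np.random.default_rng(0)
T, input_size, hidden_size = 24, 6, 8
x_seq = rng.standard_normal((T, input_size))
W = rng.standard_normal((4 * hidden_size, input_size)) * 0.1
U = rng.standard_normal((4 * hidden_size, hidden_size)) * 0.1
b = np.zeros(4 * hidden_size)

h, c = lstm_forward(x_seq, W, U, b, hidden_size)
print(h.shape)  # final hidden state; a prediction head would map this to the 6 factors
```

In a full model, a linear output layer would map the final hidden state back to the six predicted climate values for the next timestep.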

Cited by 136 publications (73 citation statements)
References 48 publications
“…In addition, Mahalanobis Distance requires additional time cost to compute the covariance matrix of different dimensions; therefore, its time complexity is not very low. While time cost is critical for real world applications especially for the big data scenario [29][30][31][32][33][34][35] . Therefore, we would continuously refine our algorithm to further reduce its time costs so as to meet the quick response requirements from users.…”
Section: Discussion
confidence: 99%
“…These architectures use sequential patterns and associations between words by treating a sentence as a series of tokens to predict sentiments specific categories such as positive or negative. LSTM can extract text context dependencies better than RNN 10 , 37 . Still, it faces significant challenges in weighting the word order, i.e., a future text has a more significant impact on the text representation than the preceding one.…”
Section: Proposed Model Architecture
confidence: 99%
“…A word embedding is a learned depiction of texts, where words with identical meanings have a similar representation. The layer learns the representation of individual input words in a text having a unique identification by initializing with random weights 10 , 37 . In this study Python Keras library that offers a framework for embedding layer was implemented.…”
Section: Proposed Model Architecture
confidence: 99%
“…Artificial intelligence, as the theoretical basis of deep learning, has been updated by researchers in recent years. Liu et al 13 proposed a long short‐term memory‐based model for greenhouse climate prediction. They used a long short‐term memory (LSTM) model to capture the dependencies in historical climate data.…”
Section: Related Work
confidence: 99%