2020
DOI: 10.35940/ijitee.c8879.019320

Crop Yield Prediction using Gradient Boosting Regression

Pratyush Mishra*,
Rahil Khan,
Dr. B. Baranidharan

Abstract: Achieving greater crop yields remains a pressing challenge for both farmers and governments. This research examines the use and implementation of Gradient Boosting Regression in predicting crop yields for numerous districts in France. XGBoost, an efficient, optimized, and flexible distributed gradient-boosting library, was used. Agricultural data was sourced from the CLAND Institute's 'Crop Data Challenge 2018', which contains approximately 38 years of maize data compiled by various departments from the months o…
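As a rough illustration of the pipeline the abstract describes, the sketch below fits an XGBoost regressor to district-level tabular data and reports an R-squared score. The file name, column names, and hyperparameter values are assumptions made for illustration only, not the configuration used in the paper.

```python
# Hypothetical sketch of district-level maize yield prediction with XGBoost.
# File name, column names, and hyperparameters are illustrative assumptions,
# not the exact setup used in the paper.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from xgboost import XGBRegressor

# Assume one row per district-year, with weather/soil features and a yield target.
df = pd.read_csv("maize_districts_france.csv")   # hypothetical file
X = df.drop(columns=["yield_t_per_ha"])          # hypothetical target column
y = df["yield_t_per_ha"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = XGBRegressor(
    n_estimators=500,      # number of boosted trees
    learning_rate=0.05,    # shrinkage applied to each tree's contribution
    max_depth=6,           # depth of each regression tree
    subsample=0.8,         # row subsampling per tree
    colsample_bytree=0.8,  # feature subsampling per tree
)
model.fit(X_train, y_train)

print("R-squared:", r2_score(y_test, model.predict(X_test)))
```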

Cited by 9 publications (2 citation statements)
References 7 publications
“…The limitation of the paper lies in its reliance on a single predictive model and data from a single country, suggesting the need for future studies to explore other machine learning algorithms and expand the scope of the research to other regions. P. Mishra et al. [34] used Gradient Boosting Regression to improve the prediction of crop yields for districts in France. The model showed an R-squared score of 0.51, which was significantly better than the other models, namely Ada Boosting, KNN, Linear and Random Forests.…”
Section: Literature Review
confidence: 99%
“…XGBoost incorporates enhancements that contribute to its robust performance, including regularization techniques, better processing of missing values, and the ability to handle non-linear relationships more effectively [85]. For example, Khan et al. [119] evaluated the performance of GBM and XGBoost, along with other machine learning algorithms, in maize yield prediction across France using meteorological data. The study demonstrated that XGBoost outperformed GBM, achieving an R² of 0.51 compared to an R² of 0.17 for GBM.…”
Section: Accuracy Assessment and Influence of Features
confidence: 99%
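As a loose illustration of the contrast drawn in this statement, the sketch below trains scikit-learn's GradientBoostingRegressor as a baseline and an XGBRegressor with explicit L1/L2 regularization on leaf weights and NaN-tolerant training. The synthetic data and parameter values are assumptions, not the cited study's setup.

```python
# Sketch contrasting plain gradient boosting with XGBoost's extra controls.
# Synthetic data and parameter values are illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from xgboost import XGBRegressor

X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline GBM: additive trees fit to the residuals of the current ensemble.
gbm = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
gbm.fit(X_train, y_train)

# XGBoost adds explicit L1/L2 penalties on leaf weights and learns a default
# split direction for missing values, so NaNs can be passed in directly.
xgb = XGBRegressor(
    n_estimators=300,
    learning_rate=0.05,
    max_depth=3,
    reg_alpha=0.1,   # L1 penalty on leaf weights
    reg_lambda=1.0,  # L2 penalty on leaf weights
)
X_train_nan = X_train.copy()
X_train_nan[np.random.default_rng(0).random(X_train.shape) < 0.05] = np.nan
xgb.fit(X_train_nan, y_train)

print("GBM     R2:", r2_score(y_test, gbm.predict(X_test)))
print("XGBoost R2:", r2_score(y_test, xgb.predict(X_test)))
```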