The objective of this work is to evaluate the capacity of C-band Synthetic Aperture Radar (SAR) time-series imagery acquired by the European satellite Sentinel-1 (S1) for agricultural crop classification and, in particular, its reliability in differentiating onion from sunflower, among other crops. The work therefore focused on classifying land cover in intensively cultivated agricultural regions. The study was developed in the Bonaerense Valley of the Colorado River (BVCR), Buenos Aires Province, Argentina, supported by ground truth from 1634 field samples. In addition to the onion and sunflower crops, other crops present in the study area, such as cereals, alfalfa, potatoes and maize, were treated as image background in the classification process. The field-sample database was used to train and support a supervised classification with two machine learning algorithms, Random Forest (RF) and Support Vector Machine (SVM), obtaining high levels of accuracy in each case. Different S1 SAR time-series features were used to assess the performance of S1 crop classification: VH+VV polarization, Grey Level Co-occurrence Matrix (GLCM) image texture, and a combination of both. The analysis of the SAR data and their features was carried out at lot level using Object-Based Image Analysis (OBIA), an effective strategy to counteract the residual speckle noise inherent to the radar signal. When differentiating onion and sunflower crops, the VH+VV stack analyzed with the SVM algorithm delivered the best statistical classification results in terms of Overall Accuracy (OA) and Kappa index (Kp) when the other crops (image background) were not considered (OA = 95.35%, Kp = 0.89). The GLCM texture analysis derived from the S1 SAR images is likewise a valuable source of information for obtaining very good classification results. When differentiating sunflower from onion while also considering the other crops present in the BVCR, the GLCM stack proved to be the most suitable dataset analyzed in this work (OA = 89.98%, Kp = 0.66 for the SVM algorithm). This working methodology is applicable to other irrigated valleys in Argentina dedicated to intensive crops. There are also variables inherent to each lot, soil, crop and agricultural producer that differ according to the study area and that should be considered case by case in the future.
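A minimal sketch of the object-level classification step described above, assuming per-lot feature vectors (VH+VV backscatter time series and GLCM texture statistics) have already been extracted; the arrays `features` and `labels` are hypothetical placeholders, not the authors' data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
# One row per lot (object): mean VH+VV backscatter and GLCM statistics per date (placeholder values).
features = rng.normal(size=(1634, 40))      # 1634 field samples, 40 illustrative features
labels = rng.integers(0, 3, size=1634)      # 0 = onion, 1 = sunflower, 2 = background

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, stratify=labels, random_state=0)

classifiers = {
    "RF": RandomForestClassifier(n_estimators=500, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale")),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    # Report the same accuracy metrics used in the study: OA and Kappa.
    print(name,
          f"OA = {accuracy_score(y_test, pred):.2%}",
          f"Kp = {cohen_kappa_score(y_test, pred):.2f}")
```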
Earth observation offers an unprecedented opportunity to monitor intensively cultivated areas, providing key support to assess fertilizer needs and crop water uptake. Routine mapping of vegetation traits can help farmers monitor plant development along the crop's phenological cycle, which is particularly relevant for irrigated agricultural areas. The high spatial and temporal resolution of the Sentinel-2 (S2) multispectral instrument makes it possible to estimate leaf area index (LAI), canopy chlorophyll content (CCC), and vegetation water content (VWC) from space. Therefore, our study presents a hybrid retrieval workflow combining a physically-based strategy with a machine learning regression algorithm, i.e., Gaussian processes regression, and an active learning technique to estimate LAI, CCC and VWC of irrigated winter wheat. The established hybrid models of the three traits were validated against in situ data from a wheat campaign in the Bonaerense Valley, south of Buenos Aires Province, Argentina, in 2020. We obtained good to highly accurate validation results with LAI: R2 = 0.92, RMSE = 0.43 m2 m−2, CCC: R2 = 0.80, RMSE = 0.27 g m−2 and VWC: R2 = 0.75, RMSE = 416 g m−2. The retrieval models were also applied to a series of S2 images, producing time series along the seasonal cycle that reflected the effects of fertilizer and irrigation on crop growth. The uncertainty estimates provided along with the obtained maps underlined the robustness of the hybrid retrieval workflow. We conclude that processing S2 imagery with optimised hybrid models allows accurate space-based crop trait mapping over large irrigated areas and thus can support agricultural management decisions.
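A minimal sketch of the hybrid retrieval idea: a Gaussian process regression trained on a synthetic look-up table that stands in for the radiative-transfer simulations and active-learning subset used in the study. The reflectance generator below is a hypothetical placeholder, not the actual RTM.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
lai_train = rng.uniform(0.0, 7.0, size=300)                 # simulated LAI values
# Hypothetical 10-band S2 reflectance responding (noisily) to LAI.
refl_train = np.exp(-0.4 * lai_train)[:, None] * rng.uniform(0.8, 1.2, (300, 10))

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(refl_train, lai_train)

# Per-pixel prediction with the associated uncertainty used for the trait and uncertainty maps.
refl_pixels = np.exp(-0.4 * np.array([1.0, 3.0, 5.0]))[:, None] * np.ones((3, 10))
lai_mean, lai_std = gpr.predict(refl_pixels, return_std=True)
print(lai_mean, lai_std)
```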
Synthetic aperture radar (SAR) data provide an appealing opportunity for all-weather, day-or-night Earth surface monitoring. The European constellation Sentinel-1 (S1), consisting of the S1-A and S1-B satellites, offers a suitable revisit time and spatial resolution for the observation of croplands from space. The C-band radar backscatter is sensitive to vegetation structure changes and phenology as well as soil moisture and roughness. It also varies depending on the local incidence angle (LIA) of the SAR acquisition geometry. The LIA dependency of the backscatter could therefore be exploited to improve the retrieval of crop biophysical variables. The availability of S1 radar time-series data at distinct observation angles makes it feasible to retrieve the leaf area index (LAI) evolution with dense spatiotemporal coverage of intensively cultivated areas. Accordingly, this research presents a workflow merging multi-date, smoothed S1 data acquired at distinct LIAs with Gaussian processes regression (GPR) and a cross-validation (CV) strategy to estimate cropland LAI of irrigated winter wheat. The GPR-S1-LAI model was tested against in situ data of the 2020 winter wheat campaign in the irrigated valley of the Colorado River, south of Buenos Aires Province, Argentina. We achieved adequate cross-validation results for LAI, with R2CV = 0.67 and RMSECV = 0.88 m2 m−2. The trained model was further applied to a series of S1 stacked images, generating temporal LAI maps that reflect the crop growth cycle well. The robustness of the retrieval workflow is supported by the uncertainty estimates provided along with the obtained maps. We found that processing smoothed S1 imagery with distinct acquisition geometries permits accurate radar-based LAI modeling over large irrigated areas and consequently can support agricultural management practices in cloud-prone agri-environments.
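A minimal sketch of the GPR-S1-LAI cross-validation strategy, assuming per-date VH and VV backscatter and the local incidence angle (LIA) have already been extracted at the in situ plots; the arrays below are hypothetical placeholders, not the campaign data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import cross_val_predict, KFold
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 3))           # columns: VH [dB], VV [dB], LIA [deg] (placeholders)
lai = rng.uniform(0, 6, size=120)       # in situ LAI [m2 m-2] (placeholders)

# Anisotropic RBF kernel: one length scale per input feature, plus a noise term.
gpr = GaussianProcessRegressor(kernel=RBF([1.0, 1.0, 1.0]) + WhiteKernel(),
                               normalize_y=True)
cv = KFold(n_splits=5, shuffle=True, random_state=0)
lai_cv = cross_val_predict(gpr, X, lai, cv=cv)
rmse_cv = np.sqrt(mean_squared_error(lai, lai_cv))
print(f"R2_CV = {r2_score(lai, lai_cv):.2f}", f"RMSE_CV = {rmse_cv:.2f} m2 m-2")
```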
The production of onion bulbs (Allium cepa L.) requires a large amount of nitrogen. In line with the demands of sustainable agriculture, information and communication technologies make it possible to improve the efficiency of nitrogen fertilization. In the south of Buenos Aires Province, Argentina, between 8000 and 10,000 hectares are cultivated per year in the districts of Villarino and Patagones. This work aimed to analyze the relationship of the biophysical variables leaf area index (LAI), canopy chlorophyll content (CCC), and canopy cover factor (fCOVER) with the nitrogen fertilization of an intermediate-cycle onion crop and its effects on yield. A field trial with different doses of granulated urea was carried out, in which the biophysical variables were evaluated both in the field and from Sentinel-2 satellite observations. Field data correlated well with satellite data, with R2 values of 0.91, 0.96, and 0.85 for LAI, fCOVER, and CCC, respectively. The application of nitrogen at all doses produced significantly higher yields than the control. The LAI and CCC variables were positively correlated with yield in November and December. A significant difference was observed between U250 (62 Mg ha−1) and the other treatments. The U500 dose led to a yield increase of 27% compared to U250, while the difference between U750 and U500 was 6%.
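A small sketch of the two analyses reported above: the field-versus-satellite correlation (R2) and the relative yield gain between fertilization doses. The LAI arrays are illustrative placeholders, not the trial data; only the U250 yield and the 27% gain are taken from the text.

```python
import numpy as np
from scipy.stats import linregress

# Correlation between field-measured and Sentinel-2-derived LAI (placeholder values).
lai_field = np.array([0.5, 1.2, 2.0, 2.8, 3.5])
lai_s2 = np.array([0.6, 1.1, 2.1, 2.6, 3.6])
fit = linregress(lai_field, lai_s2)
print(f"R2 = {fit.rvalue**2:.2f}")

# Yield implied for U500 by the reported 27% increase over U250 (62 Mg ha-1).
yield_u250, u500_gain = 62.0, 0.27
print(f"implied U500 yield ~ {yield_u250 * (1 + u500_gain):.0f} Mg ha-1")
```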
Optical Earth observation is often limited by weather conditions such as cloudiness. Radar sensors have the potential to overcome these limitations; however, due to the complex radar-surface interaction, retrieving crop biophysical variables with this technology remains an open challenge. Aiming to benefit simultaneously from the optical-domain background and the all-weather imagery provided by radar systems, we propose a data fusion approach focused on the cross-correlation between radar and optical data streams. To do so, we analyzed several multiple-output Gaussian processes (MOGP) models and their ability to efficiently fuse Sentinel-1 (S1) Radar Vegetation Index (RVI) and Sentinel-2 (S2) vegetation water content (VWC) time series over a dry agri-environment in southern Argentina. MOGP models exploit not only the auto-correlations of the S1 and S2 data streams independently but also their inter-channel cross-correlations. The S1 RVI and S2 VWC time series at the selected study sites, which served as inputs to the MOGP models, proved to be closely correlated. Among the set of assessed models, the Convolutional Gaussian model (CONV) delivered notably accurate data fusion results over winter wheat croplands of the 2020 and 2021 campaigns (NRMSE = 16.1% for wheat 2020 and 10.1% for wheat 2021). Subsequently, to simulate the presence of clouds in the scenes, we removed from the S1 & S2 dataset the S2 observations corresponding to the complete phenological cycle of winter wheat, from September to the end of December, and applied the CONV model at the pixel level to reconstruct the spatiotemporally latent VWC maps. After applying the fusion strategy, the phenology of winter wheat was successfully recovered in the absence of optical data. Strong correlations were obtained between the S2 VWC maps and the reconstructed S1 & S2 MOGP VWC maps for the assessment dates (mean R2 = 0.95 for wheat 2020 and 0.96 for wheat 2021). Altogether, the fusion of S1 SAR and S2 optical EO data streams with MOGP offers a powerful and innovative approach for cropland trait monitoring over cloudy high-latitude regions.
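An illustrative multi-output GP sketch using an intrinsic coregionalization model (ICM) from GPy as a simpler stand-in for the convolutional MOGP (CONV) used in the study; the RVI and VWC time series below are hypothetical placeholders. The sparse VWC channel is reconstructed on a dense time grid by exploiting its cross-correlation with the dense RVI channel.

```python
import numpy as np
import GPy

rng = np.random.default_rng(3)
t_rvi = np.linspace(0, 120, 40)[:, None]     # S1 RVI acquisition days (dense)
t_vwc = np.linspace(0, 120, 15)[:, None]     # sparser S2 VWC days (cloud gaps)
rvi = 0.4 + 0.3 * np.sin(t_rvi / 40) + 0.02 * rng.normal(size=t_rvi.shape)
vwc = 800 + 600 * np.sin(t_vwc / 40) + 20 * rng.normal(size=t_vwc.shape)

# Two-output coregionalized GP: shared RBF kernel plus a coregionalization matrix.
kernel = GPy.util.multioutput.ICM(input_dim=1, num_outputs=2,
                                  kernel=GPy.kern.RBF(1))
model = GPy.models.GPCoregionalizedRegression([t_rvi, t_vwc], [rvi, vwc],
                                              kernel=kernel)
model.optimize()

# Predict the VWC output (index 1) on a dense grid; GPy expects the output
# index appended as an extra input column and passed as metadata.
t_new = np.linspace(0, 120, 100)[:, None]
X_new = np.hstack([t_new, np.ones_like(t_new)])
mean, var = model.predict(X_new,
                          Y_metadata={"output_index": X_new[:, 1:].astype(int)})
```

In practice the two outputs would be normalized to comparable scales before fitting; the sketch keeps the raw units for readability.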