Article

Monitoring Pasture Aboveground Biomass and Canopy Height in an Integrated Crop–Livestock System Using Textural Information from PlanetScope Imagery

by Aliny A. Dos Reis 1,2,*, João P. S. Werner 2, Bruna C. Silva 2, Gleyce K. D. A. Figueiredo 2, João F. G. Antunes 3, Júlio C. D. M. Esquerdo 3, Alexandre C. Coutinho 3, Rubens A. C. Lamparelli 1, Jansle V. Rocha 2 and Paulo S. G. Magalhães 1

1 Interdisciplinary Center of Energy Planning – NIPE, University of Campinas – UNICAMP, Campinas 13083-896, SP, Brazil
2 School of Agricultural Engineering – FEAGRI, University of Campinas – UNICAMP, Campinas 13083-875, SP, Brazil
3 Embrapa Agricultural Informatics, Brazilian Agricultural Research Corporation – Embrapa, Campinas 13083-886, SP, Brazil
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(16), 2534; https://doi.org/10.3390/rs12162534
Submission received: 25 June 2020 / Revised: 31 July 2020 / Accepted: 4 August 2020 / Published: 6 August 2020
(This article belongs to the Special Issue Advances of Remote Sensing in Pasture Management)

Abstract:
Fast and accurate quantification of the available pasture biomass is essential to support grazing management decisions in intensively managed fields. The increasing temporal and spatial resolutions offered by the new generation of orbital platforms, such as Planet CubeSat satellites, have improved the capability of monitoring pasture biomass using remotely sensed data. Here, we assessed the feasibility of using spectral and textural information derived from PlanetScope imagery for estimating pasture aboveground biomass (AGB) and canopy height (CH) in intensively managed fields and the potential for enhanced accuracy by applying the extreme gradient boosting (XGBoost) algorithm. Our results demonstrated that the texture measures enhanced AGB and CH estimations compared to the performance obtained using only spectral bands or vegetation indices. The best results were found by employing the XGBoost models based only on texture measures. These models achieved moderately high accuracy to predict pasture AGB and CH, explaining 65% and 89% of AGB (root mean square error (RMSE) = 26.52%) and CH (RMSE = 20.94%) variability, respectively. This study demonstrated the potential of using texture measures to improve the prediction accuracy of AGB and CH models based on high spatiotemporal resolution PlanetScope data in intensively managed mixed pastures.

Graphical Abstract

1. Introduction

Grazing pasture and croplands occupy a significant portion of land surface in the world. In the last decades, many countries, such as Australia [1,2], France [3], United States of America [4,5], and Brazil [6,7,8], have adopted the integration of crop and livestock systems as an alternative for sustainable production intensification. Integrated Crop–Livestock Systems (ICLS) seek the synergism between agricultural and livestock resources in combination with land use and management practices focused on the conservation of natural resources [9].
Adequate grazing management is of critical importance to ensure greater forage production to the livestock component of ICLS. Thus, understanding the spatiotemporal dynamics of forage resources in grazed areas is fundamental to support grazing management decisions [10], especially in intensively managed fields. Pasture monitoring at a fine scale, based on field measurements, is time-consuming and often spatially limited—based on sampling design distribution and intensity—and is unlikely to give representative information of large pasture areas [11].
An alternative approach for pasture monitoring is the use of remotely sensed data [12]. Previous studies have reported good agreement between field measurements of pasture biomass and information extracted from satellite data [13,14,15]. However, the temporal frequency of satellite data and the spatial resolution needed to capture the forage production variation between or within paddocks have been an obstacle to achieve effective pasture monitoring using the commonly used satellite data, such as Terra/Aqua/MODIS (Moderate Resolution Imaging Spectroradiometer) (250 m near daily since 2000) [16,17] and Landsat 5/TM, 7/ETM+, 8/OLI (30 m every 16 days from 1984) [18,19,20].
The new generation of orbital platforms, the so-called constellations of nano-satellites such as Planet CubeSats, offers an unprecedented combination of high temporal (daily) and high spatial (3 m) resolution imagery that may overcome the spatiotemporal limitations of the available satellite data sources [21], as well as advance pasture monitoring at a finer scale in intensively managed fields. In addition to the spectral information, texture measures derived from high spatial resolution optical imagery allow the incorporation of both the spectral and the spatial distribution of image grey values [22,23,24], enabling a finer distinction of structural detail within pasturelands. However, incorporating textural information to monitor intensively managed pasture fields is challenging, since texture varies widely depending on the landscape, the types of measures, and the associated parameters (e.g., window size, offset). To date, few studies have explored the potential of using texture measures for monitoring pasture biomass.
The integration of spectral and textural information derived from remotely sensed data results in high-dimensional and complex datasets, highlighting the need for robust modelling approaches. Machine learning (ML) algorithms are powerful tools to cope with this kind of high-dimensional and complex data [25,26] and have been increasingly used for a wide range of tasks, including pasture monitoring [15,27,28]. The extreme gradient boosting (XGBoost) algorithm, a novel implementation of gradient boosting decision trees [29], has demonstrated excellent performance in many applications due to its high efficiency and impressive accuracy [30,31,32].
In this study, we assessed the feasibility of using spectral and textural information derived from high spatiotemporal resolution PlanetScope imagery for estimating and monitoring aboveground biomass (AGB) and canopy height (CH) of intensively managed mixed pastures in an ICLS in the western region of São Paulo State, Brazil. Additionally, we assessed the potential for enhanced estimation accuracy by applying the XGBoost algorithm compared with the well-known ML algorithm random forest (RF).

2. Materials and Methods

2.1. Study Area

To evaluate the performance of PlanetScope imagery for estimating and monitoring AGB and CH of intensively managed pasturelands, four fields of approximately 50 hectares each were selected in the western region of São Paulo State, Brazil (Figure 1). The four fields have trees for shade and are split into 13 paddocks, between which the grazing livestock (cattle) rotate throughout the season. This area has been intensively managed as an ICLS based on the rotation of cultivated pasture during the winter season (usually between April and October) and soybean cultivation in the summer season (usually between November and March), with pasture being the focus of this study.
The investigated pasture is composed of a consortium of ruzi grass (Urochloa ruziziensis) and millet (Pennisetum glaucum), sown at a proportion of 15 kg ha−1 of millet and 5 kg ha−1 of ruzi grass in a spacing of 0.17 m between rows. Due to its agronomic characteristics of rapid growth, high drought resistance, and high dry-matter production, the consortium of millet with tropical forages has been used as a management strategy to increase biomass production during the winter season and to decrease the time interval between sowing and the first grazing events in our study area. Pasture sowing began on 28 March, after soybean harvest, and lasted until 6 April 2019.
The study area has tropical climatic conditions, corresponding to Köppen’s climatic type Aw, with drier months during the winter (i.e., June–August) [33]. The mean annual rainfall varies between 1200 mm and 1400 mm, concentrated in the months of December and January. Mean daily temperatures during the coldest month typically exceed 18 °C [34]. Elevations range between 310 m and 370 m above mean sea level. Soils are predominately sandy loam Oxisol, originated from sandstones of the Bauru group, with clay contents from 22 to 241 g kg−1.
During the forage growing season, the expression of the productive potential of tropical pastures is determined by environmental factors, mainly temperature and water availability [35]. The meteorological records for our study area (Figure 2) during the forage growing season of 2019 are based on daily precipitation and temperature records from the fifth generation ECMWF (European Centre for Medium-Range Weather Forecasts) reanalysis (ERA5) data of the global climate [36].

2.2. Field Data Collection

Pasture AGB and CH data were collected during the growing season (from May to November of 2019) using a sampling grid with one hundred georeferenced points (Figure 3), defined to maximize the cost–benefit ratio between sampling efforts (in terms of time and financial resources) and monitoring results, and to obtain acceptable prediction accuracies in the modelling of AGB and CH [37]. The sampling points were distributed within the study area with a sampling intensity of 25 points per field using a stratified systematic unaligned sampling design. Field data collection was conducted on six dates: 17 May, 25 May, 18 June, 14 July, 12 August, and 2 November 2019. The field campaigns of May (17 May and 25 May) occurred before the animals' entrance into the fields. The following field campaign dates (18 June, 14 July, 12 August, and 2 November 2019) were defined in order to capture different phases of pasture growth (millet and ruzi grass) and biomass availability, as well as according to the entry and exit of animals in the paddocks. Although we intended to collect field data at all one hundred sampling points in every month of the forage growing season, this was not possible due to the presence of animals in part of the study area in some months and in the entire study area in September and October (Table 1).
The rotational grazing system, as well as the dates of entrance and exit of the animals in the paddocks, were defined by the farm manager. The first grazing cycle happened between the months of May and June, when the fields were predominantly covered by millet. In July, the fields were without animals, allowing the growth of forage, especially ruzi grass. However, the ruzi grass had a low biomass accumulation in July as a response to plant water stress. Later, in August, when the ruzi grass reached an adequate level of coverage, the animals were allocated in the fields in a lower stocking rate and stayed there until the beginning of November.
Pasture AGB was measured by destructive sampling from 1 m2 frames (Figure 3b). The fresh biomass was weighed in the field using a hanging scale, and then the ruzi grass and millet were separated and weighed again. To determine dry mass (g m−2), the fresh biomass of ruzi grass and millet was dried at 65 °C in the laboratory for 72 h and then weighed to obtain the total AGB per sampling point and field campaign. Simultaneously with AGB data collection, CH data were measured using a sward stick at 11 representative locations within a buffer of 5 m (Figure 3c) whose centroid was the position of each sampling point. To obtain the mean CH, we calculated a weighted mean height based on the proportion of millet to ruzi grass at each sampling point. The proportion of millet at a specific sampling point was determined by dividing the weight of millet dry mass by the total AGB weight at that sampling point; the proportion of ruzi grass was determined in the same way from the ruzi grass dry mass. The field-based descriptive statistics for the pasture under study are summarized in Table 1.
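The weighted mean CH computation described above can be sketched as follows; the function names and the numerical values are illustrative, not study data.

```python
def species_proportions(millet_dry_g, ruzi_dry_g):
    """Dry-mass fraction of each forage in the total AGB of a sampling point."""
    total = millet_dry_g + ruzi_dry_g
    return millet_dry_g / total, ruzi_dry_g / total


def weighted_mean_ch(millet_dry_g, ruzi_dry_g, millet_ch_m, ruzi_ch_m):
    """Mean CH weighted by the millet:ruzi grass dry-mass proportions."""
    p_millet, p_ruzi = species_proportions(millet_dry_g, ruzi_dry_g)
    return p_millet * millet_ch_m + p_ruzi * ruzi_ch_m


# Illustrative sampling point: 300 g of millet and 100 g of ruzi grass dry
# mass (proportions 0.75 and 0.25), with mean heights of 1.0 m and 0.4 m.
ch = weighted_mean_ch(300.0, 100.0, 1.0, 0.4)  # 0.75*1.0 + 0.25*0.4 = 0.85 m
```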

2.3. Remote Sensing Data Collection and Preprocessing

PlanetScope CubeSat multispectral images were acquired for this study. The selected cloud-free scenes covering the study area were acquired on the dates that most closely coincided with the field campaign dates (i.e., 20 May 2019, 27 May 2019, 17 June 2019, 11 July 2019, 10 August 2019, and 2 November 2019, respectively) and were downloaded from the Brazilian Planet Labs commercial representative platform. PlanetScope is a satellite constellation comprising over 130 CubeSats with a 3U form factor (0.10 m by 0.10 m by 0.30 m), called “Doves”, which have the capability to image all of the Earth’s land surface on a daily basis. The PlanetScope satellites have four spectral bands: blue (B: 455–515 nm), green (G: 500–590 nm), red (R: 590–670 nm), and near infrared (NIR: 780–860 nm), with a spatial resolution of ~3 m (nadir ground sampling distance) and an equatorial overpass time of 9:30–11:30 a.m. (local solar time) [21].
We used the Planet Surface Reflectance (SR) Product, which is derived from the standard Planet Analytic Product (Radiance) and processed to top of atmosphere (TOA) reflectance and then atmospherically corrected to surface reflectance using coefficients supplied with the Planet Radiance product and the 6SV2.1 radiative transfer code [38].

2.4. Vegetation Indices

For each PlanetScope image, twenty-four vegetation indices (VIs) that include only visible and NIR spectral bands in their formulation were derived (Table 2). These indices have been used in agricultural studies and have shown potential to detect vegetation characteristics that could be crucial for pasture AGB and CH estimation.
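As an illustration, a few standard visible/NIR indices of the kind compiled in Table 2 (the table itself is not reproduced here) can be computed directly from the surface reflectance bands; the reflectance values below are made up.

```python
import numpy as np


def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)


def gndvi(nir, green):
    """Green NDVI, using the green band instead of the red band."""
    return (nir - green) / (nir + green)


def evi2(nir, red):
    """Two-band enhanced vegetation index (no blue band required)."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)


# Illustrative surface reflectance values for two pixels
nir = np.array([0.45, 0.50])
red = np.array([0.05, 0.10])
green = np.array([0.08, 0.12])
print(ndvi(nir, red))  # [0.8        0.66666667]
```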

2.5. Texture Measures

In order to explore the potential of PlanetScope-derived textural information in the prediction of pasture AGB and CH, we used the grey level co-occurrence matrix (GLCM) statistical texture approach to derive texture images [58]. Texture measures are commonly used to quantify the spatial distribution of cell greyscale values in an image, which can provide useful information on AGB and CH in mixed pastures. In our study area, the variation in cell greyscale values can result from changes in the pasture canopy structure caused by variations in the proportion of millet and ruzi grass, grazing intensity, and pasture coverage. In addition, the presence of terraces (a soil conservation technology adopted by the farmer; Figure 1) in the pasture fields will affect the texture measures obtained using different window sizes and offsets.
The GLCM uses a spatially dependent grey level matrix to compute texture measures from which the spatial arrangements of image colors or intensities are represented. Eight second-order GLCM texture measures, including mean (MEA), variance (VAR), homogeneity (HOM), contrast (CON), dissimilarity (DIS), entropy (ENT), second moment (2M), and correlation (COR) were calculated using the ENVI/IDL software (Harris Geospatial Solutions, Inc., Broomfield, CO, USA). MEA is the grey tone average in the GLCM window. VAR is a measure of dispersion of the grey tone values from its mean. High VAR suggests large standard deviation of grey tone in the local region. HOM is a measure of homogenous pixel values across an image, and high values suggest small grey tone differences between neighboring pixels. CON is a measurement of variability and high values mean large local variations or the opposite of HOM. DIS is inversely related to HOM and similar to CON, and high values suggest local region with high variability in an image. ENT is a measurement of disorder degree in an image, and high values mean a heterogeneous texture or the opposite of 2M. 2M, or angular second moment, is a measure of textural uniformity and is high for locally homogenous regions — similar to HOM. Finally, COR is a measurement of grey tone linear dependencies between a pair of pixels in an image (Figure 4 illustrates the eight second-order GLCM texture measures derived from a subset of a NIR PlanetScope band from May 2019).
To determine the optimal window size for our study, the eight texture measures were calculated for the G, R, and NIR bands of PlanetScope images using three window sizes: 3 × 3, 5 × 5, and 7 × 7 pixels. For each window size, texture measures were also calculated at four offsets (θ), 0°, 45°, 90°, and 135°, represented in Cartesian coordinates as [0, 1], [−1, 1], [−1, 0], and [−1, −1], respectively. As a result, a total of 288 texture features were generated for each PlanetScope image. All GLCMs were constructed using a 64 grey level quantization. PlanetScope blue band was not used to derive GLCM textures, because this band is strongly influenced by atmospheric scattering [59].
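The GLCM computation itself was carried out in ENVI/IDL; the pure-NumPy sketch below shows how a co-occurrence matrix and a subset of the eight measures can be obtained for a single window and offset, assuming the grey levels have already been quantized (e.g., to 64 levels).

```python
import numpy as np


def glcm(window, offset=(0, 1), levels=64):
    """Normalized grey level co-occurrence matrix for one pixel offset.

    The offset follows the paper's notation, e.g. (0, 1) for 0 degrees.
    """
    dr, dc = offset
    P = np.zeros((levels, levels))
    rows, cols = window.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[window[r, c], window[r2, c2]] += 1
    return P / P.sum()


def texture_measures(P):
    """A subset of the eight second-order GLCM measures used in the study."""
    i, j = np.indices(P.shape)
    mea = (i * P).sum()
    nz = P > 0
    return {
        "MEA": mea,                               # GLCM mean
        "VAR": (((i - mea) ** 2) * P).sum(),      # variance
        "CON": (((i - j) ** 2) * P).sum(),        # contrast
        "HOM": (P / (1.0 + (i - j) ** 2)).sum(),  # homogeneity
        "ENT": -(P[nz] * np.log(P[nz])).sum(),    # entropy
        "2M": (P ** 2).sum(),                     # angular second moment
    }


# A perfectly uniform window has zero contrast, variance, and entropy,
# while homogeneity and the second moment equal 1.
win = np.full((3, 3), 10, dtype=int)
measures = texture_measures(glcm(win, offset=(0, 1)))
```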

2.6. Spectral and Textural Data Extraction

From the coordinates of the sampling points, we generated a window of 3 × 3 pixels to extract the spectral and textural data of the PlanetScope images. In each window, the extracted pixel values in each of the predictor variables (spectral bands, vegetation indices, and texture measures) were averaged and then associated with the field-based measures of pasture AGB and CH for model development. By using this approach, we expected to reduce the uncertainty of positional discrepancy between the image and field data, as well as to take into account errors associated with GNSS single-point positioning.
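A minimal sketch of this 3 × 3 window averaging, with a hypothetical predictor raster and pixel coordinates:

```python
import numpy as np


def window_mean(raster, row, col, half=1):
    """Mean of the (2*half+1) x (2*half+1) window centred on (row, col).

    `raster` stands for any predictor layer (spectral band, vegetation
    index, or texture measure); the sampling-point pixel is assumed to be
    far enough from the image edge for the full window to fit.
    """
    win = raster[row - half:row + half + 1, col - half:col + half + 1]
    return float(win.mean())


# Illustrative 5 x 5 predictor layer; the central 3 x 3 block averages 12.0
raster = np.arange(25, dtype=float).reshape(5, 5)
value = window_mean(raster, 2, 2)
```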

2.7. Pasture AGB and CH Modelling

To explore the potential of using spectral and textural information derived from PlanetScope imagery to estimate pasture AGB and CH, we designed four prediction scenarios (SC) using both the XGBoost and RF machine learning algorithms: the first three scenarios used, respectively, spectral bands (G, R, and NIR) (SC1), vegetation indices (SC2), and the optimum texture measures (SC3) individually to estimate AGB and CH, whilst the fourth scenario (SC4) investigated the combined use of the spectral bands (SC1), vegetation indices (SC2), and optimum texture measures (SC3). To determine the optimum texture measures in SC3, we assessed the effect of the window size and offset texture parameters on the accuracy of the AGB and CH models using both the XGBoost and RF algorithms, and the optimum parameter set (window size and offset) was chosen for each target variable (AGB and CH).
In order to develop single AGB and CH models, which are able to encompass the complexity and variability of mixed pastures throughout the forage growing season and are dependent only on remotely sensed variables, we used the complete dataset (considering data from all field campaigns and image dates) to train and test the XGBoost and RF models. The dataset was divided randomly into 70% for training and 30% for testing of the models. All AGB and CH modelling analyses and evaluations were performed using the mlr package in R software [60].

Machine Learning Regression Algorithms

The XGBoost algorithm proposed by Chen and Guestrin [29] is an advanced implementation of the gradient boosting framework of decision trees. This algorithm creates a number of decision trees sequentially based on the idea of “boosting”, which combines all the predictions of a set of weak learners to develop a strong learner through additive training strategies. XGBoost has shown superiority over other machine learning algorithms and achieved outstanding performances in many research areas [32,61,62,63].
The RF algorithm is an ensemble of decision trees based on the bagging technique, first introduced by Breiman [64]. For regression problems, this algorithm grows a large number of decision trees (forest), and the final prediction value corresponds to the averaged output of all individual decision trees. Each tree in the forest is independently constructed during the training process using a bootstrap sample (sample with replacement) of the training data. RF is a well-known regression method with high accuracy and robustness that provides fast, flexible, robust, and accurate predictive capabilities.

2.8. Model Evaluation

2.8.1. Hyperparameter Tuning in XGBoost and RF Models

Each machine learning algorithm requires a set of hyperparameters that need to be tuned properly during the model development to obtain the best performance. Optimal values of hyperparameters for each XGBoost and RF model were selected according to the accuracy estimation in the training dataset using the 5-fold cross-validation method. We employed the random search method for tuning the hyperparameters [65] for each XGBoost and RF model using the candidate value ranges presented in Table 3. This method repeatedly trains the models, and each run is based on a random sample from all combinations of hyperparameter values.
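The study performed this tuning with the mlr package in R; the Python sketch below illustrates the same idea (random search over a hyperparameter range, scored by 5-fold cross-validation), using a closed-form ridge regression as a hypothetical stand-in model.

```python
import numpy as np

rng = np.random.default_rng(0)


def ridge_fit_predict(X_tr, y_tr, X_te, alpha):
    """Closed-form ridge regression used here as a simple stand-in model."""
    w = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(X_tr.shape[1]),
                        X_tr.T @ y_tr)
    return X_te @ w


def cv_rmse(X, y, alpha, k=5):
    """Mean RMSE over k cross-validation folds for one hyperparameter value."""
    folds = np.array_split(rng.permutation(len(y)), k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = ridge_fit_predict(X[train], y[train], X[test], alpha)
        errors.append(np.sqrt(np.mean((y[test] - pred) ** 2)))
    return float(np.mean(errors))


def random_search(X, y, n_iter=20, lo=1e-3, hi=1e2):
    """Draw hyperparameter values at random; keep the best CV score."""
    best_alpha, best_rmse = None, np.inf
    for alpha in 10 ** rng.uniform(np.log10(lo), np.log10(hi), n_iter):
        score = cv_rmse(X, y, alpha)
        if score < best_rmse:
            best_alpha, best_rmse = float(alpha), score
    return best_alpha, best_rmse
```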

2.8.2. Feature Importance

The relative importance of the predictor variables for each XGBoost and RF model was calculated based on the built-in feature importance measures of both algorithms, enabling the most important variables in each model run to be interpreted. In the RF models, the variables are ranked based on the increase in the mean square error (IncMSE) when a variable is randomly permuted [64]. In the XGBoost models, the variables are ranked based on information gain, i.e., the improvement in accuracy brought by a predictor variable to the branches it is on. The gain of each predictor variable in every tree is taken into account and then averaged per predictor variable to give a view of the entire model [29].
Although machine learning algorithms may excel at handling settings with several predictor variables, feature selection may improve the performance of the algorithms in terms of learning speed, generalization capability, and interpretability of the resulting model [66]. In our study, we tested whether feature selection would further improve the performance of the XGBoost and RF models. We used the regressional ReliefF (RReliefF) algorithm to select the optimal subset of predictor variables [67]. RReliefF is a classic filter-based method and a modification of the ReliefF algorithm, suitable for regression tasks with continuous output values. The performance of the RF and XGBoost models considering all predictor variables and only the selected variables was compared, and the RF and XGBoost models with the highest prediction accuracy were selected per prediction scenario.

2.8.3. Accuracy Assessment and Uncertainty Analysis

To evaluate and compare the performance of the XGBoost and RF models, we employed the root mean square error (RMSE), in absolute and percentage terms, and the coefficient of determination (R2), calculated from the field-based pasture AGB and CH measurements in the testing dataset, i.e., data not used in the model development process. Then, the best performing models were used to estimate AGB and CH for our entire study area. In addition, we used a tree mask to avoid prediction errors at the tree locations (Figure 1).
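These metrics can be written out as follows; normalizing the percentage RMSE by the mean of the observed values is an assumption, since the paper does not spell out the denominator.

```python
import numpy as np


def rmse(obs, pred):
    """Root mean square error in the units of the target variable."""
    return float(np.sqrt(np.mean((obs - pred) ** 2)))


def rmse_pct(obs, pred):
    """RMSE as a percentage of the observed mean (assumed normalization)."""
    return 100.0 * rmse(obs, pred) / float(np.mean(obs))


def r2(obs, pred):
    """Coefficient of determination."""
    ss_res = float(np.sum((obs - pred) ** 2))
    ss_tot = float(np.sum((obs - np.mean(obs)) ** 2))
    return 1.0 - ss_res / ss_tot


# Illustrative test-set values (g m^-2)
obs = np.array([100.0, 200.0, 300.0, 400.0])
pred = np.array([110.0, 190.0, 310.0, 390.0])
# rmse = 10.0, rmse_pct = 4.0, r2 = 0.992
```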
In our study, the uncertainty analysis for AGB and CH estimation was carried out following the procedure adopted by Csillik et al. [68,69]. We grouped the estimated values of AGB and CH (separately) into 10 bins using the natural breaks method and computed the RMSE in percentage (%) for each bin. Next, we fitted a regression model using the 10 RMSE bin values and the estimated mean values of AGB and CH per bin. The fitted regression models were used to create spatially explicit uncertainty maps for each estimated map of AGB and CH.
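A sketch of this per-bin uncertainty procedure follows; quantile bins stand in for the natural breaks (Jenks) method used in the study, and the polynomial degree is illustrative.

```python
import numpy as np


def binned_uncertainty(obs, pred, n_bins=10, degree=2):
    """RMSE (%) per bin of predicted values, plus a fitted polynomial.

    Quantile bins are used here as a stand-in for the natural breaks
    (Jenks) method of the study.
    """
    edges = np.quantile(pred, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, pred, side="right") - 1, 0, n_bins - 1)
    bin_means, bin_rmses = [], []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            err = np.sqrt(np.mean((obs[mask] - pred[mask]) ** 2))
            bin_means.append(pred[mask].mean())
            bin_rmses.append(100.0 * err / obs[mask].mean())
    # Polynomial linking a predicted value to its expected relative RMSE,
    # which can then be applied pixel-wise to build an uncertainty map.
    return np.poly1d(np.polyfit(bin_means, bin_rmses, degree))
```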

3. Results

The window size and offset (orientation) texture parameters influenced the accuracy of AGB and CH estimation using both RF and XGBoost machine learning algorithms (Table 4).
In general, the XGBoost models resulted in the highest prediction accuracies. For AGB estimation, the highest R2 (0.65) and the lowest RMSE (26.52%) were achieved using a 7 × 7 window size and the 90° offset. On the other hand, the 5 × 5 window size and the 135° offset resulted in the highest R2 (0.89) and the lowest RMSE (20.94%) for CH estimation.
Our results show that the XGBoost models achieved slightly superior performance in predicting AGB and CH compared with the RF models in all prediction scenarios tested in this study (Table 5). The spectral band models (SC1) for AGB and CH estimation had less predictive ability than the models in the other prediction scenarios (SC2, SC3, and SC4). The optimum texture measures (SC3) provided a considerable improvement in the accuracy of AGB (an increase of up to 23% in the R2 values) and CH (an increase of around 3% in the R2 values) estimation compared with using only spectral bands (SC1) or vegetation indices (SC2). However, the combination of the spectral bands, the vegetation indices, and the optimum texture measures (SC4) did not improve the AGB and CH estimation accuracies compared to those obtained using only the optimum texture measures.
The predictions of AGB and CH, retrieved from the best XGBoost models using the optimum texture measures, showed good agreement with the field-based measurements in the testing dataset (Figure 5). The AGB (Figure 5a) and CH (Figure 5b) estimates showed strong linear relationships between predicted values and field-based measurements (Pearson’s correlation (r) = 0.80 and r = 0.94, respectively). In general, the lowest values of AGB were overestimated, while the highest observed AGB values were underestimated. In spite of the high agreement between predicted and measured values of CH, we observed a slight underestimation trend for CH values higher than 0.9 m.
Among the optimum texture measures, the MEA textures obtained from the G, R, and NIR spectral bands were the most important variables in the prediction of AGB and CH, as indicated by the relative importance metric of the best XGBoost models (Figure 6). The CON, VAR, COR, and ENT textures of NIR and R bands were among the subsequent most important variables for the AGB and CH predictions. On the other hand, DIS and HOM textures contributed little to the AGB and CH models.
The spatiotemporal variations in pasture AGB and CH through the forage growing season are shown in the maps obtained with the best-performing XGBoost models (Figure 7 and Figure 8) using the optimum texture measures extracted from the PlanetScope spectral bands (SC3). Predicted AGB varied from 51.07 to 463.89 g m−2. For CH, the predicted values ranged from 0.16 to 1.06 m. The spatiotemporal changes in pasture AGB and CH for all paddocks during the growing season were consistent with the expected changes in pasture vegetation, as driven by forage development, management operations, and weather conditions. The maximum values of pasture AGB and CH were measured in the fields in May, mainly due to the high proportion of millet in the pasture vegetation (Table 1). Correspondingly, the highest predicted values of AGB and CH (Figure 7 and Figure 8) were also observed for May. In June, after the first grazing cycle in some paddocks, we observed a decrease in both predicted and measured pasture AGB and CH values. July was the month with the lowest predicted and measured AGB values, due mainly to grazing intensity and meteorological drought stress (Figure 2). November showed the lowest predicted and measured CH values due to the full ruzi grass establishment (Table 1).
To create spatially explicit uncertainty maps for each estimated map of AGB and CH, we fitted polynomial functions (Figure 9) to the relative RMSE uncertainties of our AGB and CH estimations. The uncertainties in the AGB estimations (Figure 9a) were under 20% RMSE for lower predicted values of AGB (<90 g m−2) and for predicted values of AGB between 230 and 370 g m−2. AGB predictions between 110 and 205 g m−2 resulted in uncertainties equal to or higher than 25% RMSE, and the uncertainty increased for predicted values higher than 370 g m−2. Predicted values of CH (Figure 9b) between 0.26 and 0.52 m and higher than 1.01 m resulted in uncertainties higher than 25% RMSE. The uncertainties in the CH estimations were under 15% RMSE for predicted values of CH lower than 0.18 m and between 0.67 and 0.93 m.
The predicted AGB (Figure 10) and CH (Figure 11) uncertainty maps can be compared with the maps obtained by the best XGBoost models (Figure 7 and Figure 8), providing additional information about prediction reliability at the pixel level and for specific areas within the pasture fields.

4. Discussion

In this study, spectral and textural information derived from PlanetScope imagery was explored to estimate pasture AGB and CH in intensively managed fields using the XGBoost and RF machine learning algorithms. Although the differences between the AGB and CH prediction accuracies obtained by the two algorithms are small, the XGBoost models outperformed most of the RF models in the prediction scenarios analyzed (Table 4 and Table 5). The superior performance of XGBoost compared with other well-known machine learning algorithms, in terms of both accuracy and computational cost, has also been observed in other remote sensing studies [30,31,32,61]. Our results highlight the potential of using GLCM-based texture measures (SC3) to achieve enhanced AGB (RMSE = 26.52%; R2 = 0.65) and CH (RMSE = 20.94%; R2 = 0.89) prediction accuracies when compared to the use of spectral bands (SC1) or vegetation indices (SC2) as predictor variables (Table 5). The integration of texture measures with spectral bands and vegetation indices (SC4) did not result in improved AGB and CH prediction accuracies.
Texture measures derived from high-resolution satellite data enable a finer distinction of vegetation structural details [70]. For this reason, texture measures derived from optical remotely sensed data have been used to improve the prediction of vegetation structural parameters in forests [22,70,71,72,73] and croplands [74,75]. In our study area, the texture measures were able to capture the differences in AGB and CH in the pasture fields as a result of grazing intensity, proportion of millet and ruzi grass, plant stress response, and pasture coverage. The MEA texture was more important than the other texture measures in the prediction of AGB and CH, as measured by the feature importance metric of the XGBoost algorithm (Figure 6). This texture measure corresponds to the average values of grey tones or intensities in the G, R, and NIR spectral bands within the moving window that may contain information about the pasture canopy structure and the background. By averaging the grey tones values, the MEA texture smooths the image and minimizes the interference of the background [76]. The subsequent most important variables for AGB and CH predictions were the CON, VAR, COR, and ENT textures of NIR and R spectral bands. These texture measures are directly related to high variations in the grey tones’ values caused mainly by the variations in the leaf shapes through the forage growing season. Leaves are the main component of canopy responsible for the overall texture appearance. As a result, the variations in the intertwined characteristics of millet and ruzi grass through the growing season, such as leaf shape, leaf color, leaf angle, leaf size, and density, were potentially represented by the GLCM-based texture measures.
The better performance of the CH model (R² = 0.89) compared to that of the AGB model (R² = 0.65) may be explained by the geometric properties of these two variables (AGB and CH) and by variations in the pasture canopy structure through the growing season (Table 1). AGB is a three-dimensional variable. Hence, the capacity of the texture measures derived from PlanetScope imagery to provide accurate estimates of AGB depends on their ability to discriminate both the horizontal (e.g., canopy density/coverage) and the vertical (e.g., canopy height) pasture structure [77]. However, in our study area, the pasture canopy structure is directly affected by the proportion of millet and ruzi grass through the growing season, as well as by grazing intensity, which resulted in greater variation in the spatiotemporal pattern of canopy reflectance.
Millet and ruzi grass are two forages with different structural properties, growth rates, and biomass production. Millet is a tall, robust, erect, annual bunchgrass with long narrow leaves and a rapid initial growth rate, whereas ruzi grass is a creeping perennial with short rhizomes that forms a dense leafy cover over the ground. The size, shape, orientation, and roughness of individual parts (e.g., the leaves, stems, and inflorescence) of millet and ruzi grass, and their variation throughout the forage growing season, affect the reflectance and, consequently, the texture measures derived from remotely sensed imagery. When either millet or ruzi grass dominates the pasture canopy, differences in grey-tone values between neighboring pixels are smaller than when the two forages are present in more balanced proportions. At the beginning of the pasture-growing season, the high proportion of millet in the pasture canopy was responsible for the highest values of AGB and CH (Table 1, Figure 7, and Figure 8). After the first grazing cycle, the decrease in the proportion of millet favored the growth of ruzi grass and resulted in high variability in the grey tones of the images. Later in the growing season, the proportion of millet in the pasture canopy was lower than 45% in June and 15% in July. In these months, all paddocks had lower values of AGB, mainly due to grazing and plant water stress, a meteorological drought characteristic of this region during the winter months (total precipitation in June–July lower than 50 mm (Figure 2)). The lowest values of CH were observed in November, when the pasture canopy was composed only of ruzi grass. However, with the beginning of the rainy season and full ruzi grass establishment, we observed a significant increase in AGB production in November (Figure 7 and Figure 8).
The texture measures derived from PlanetScope imagery thus appear more effective at capturing variations in CH than in AGB in our study area.
Planet's constellation of CubeSat nano-satellites offers an unprecedented opportunity to monitor vegetation dynamics with enhanced spatial detail more frequently than ever before [78,79]. The temporal and spatial resolutions of commonly applied optical remotely sensed data, especially medium and coarse spatial resolution imagery, have been an obstacle to achieving effective pasture monitoring at a fine scale in intensively managed fields [15,18,80]. The increased temporal and spatial resolution offered by Planet CubeSat satellites may overcome this spatiotemporal limitation. However, while these satellites offer the combination of high temporal and spatial resolution necessary to monitor rapid changes in pasture canopy structure, they are not equivalent to a rigorously calibrated and high-performing satellite, such as Landsat [81]. Cross-sensor variations among images obtained from different nano-satellites may affect the generalization of models built on the relationship between field-based measurements and PlanetScope-derived variables; future predictions for new areas or dates will be limited to images whose histograms match those of the reference training images. Another possible limitation concerns data accessibility: Planet Labs is a commercial company, and PlanetScope imagery is therefore not open access, unlike NASA's frequently used Landsat imagery.
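The cross-sensor histogram constraint mentioned above is commonly mitigated by matching the empirical distribution of a new scene to that of a reference scene. A minimal quantile-matching sketch in NumPy follows; it is illustrative only (not part of the study's workflow), and the scene values are synthetic stand-ins for reflectance from two nano-satellites.

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source pixel values so their empirical distribution matches the reference."""
    src = source.ravel()
    ref_sorted = np.sort(reference.ravel())
    order = np.argsort(src)                        # ranks of the source pixels
    src_q = np.linspace(0.0, 1.0, src.size)        # quantile of each rank
    ref_q = np.linspace(0.0, 1.0, ref_sorted.size)
    matched = np.empty_like(src, dtype=float)
    matched[order] = np.interp(src_q, ref_q, ref_sorted)  # reference value at same quantile
    return matched.reshape(source.shape)

rng = np.random.default_rng(42)
scene_a = rng.normal(0.30, 0.05, (50, 50))   # NIR reflectance from one nano-satellite
scene_b = rng.normal(0.35, 0.08, (50, 50))   # same target seen by a second sensor
scene_b_matched = match_histogram(scene_b, scene_a)
```

After matching, the second scene's value distribution coincides with the reference's while the spatial ranking of pixels is preserved, which is the property a texture- or reflectance-based model trained on the reference imagery depends on.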
Finally, the approach used in this study provides a framework for integrating field data and textural information derived from high-resolution remotely sensed imagery to monitor AGB and CH in intensively managed mixed pasture fields. In addition, the concepts presented in this study are expected to hold regardless of the sensor, provided that the spatial resolution of the remotely sensed imagery can capture the variability of the pasture fields.

5. Conclusions

In this study, we showed that the GLCM-based texture measures derived from PlanetScope imagery enhanced the prediction accuracy of AGB and CH models compared to the performance obtained using spectral bands or vegetation indices. The extreme gradient boosting (XGBoost) algorithm outperformed the well-known machine learning (ML) algorithm random forest (RF) in all prediction scenarios analyzed in our study.
Even though the pasture canopy structure varies greatly through the growing season due to forage composition (proportion of millet and ruzi grass), grazing management, and environmental conditions, our models were able to predict the spatiotemporal changes in pasture AGB and CH with moderate (R² = 0.65) and high (R² = 0.89) prediction accuracy, respectively.
Despite possible limitations, this study demonstrated the potential of using texture measures to improve the prediction accuracy of AGB and CH models based on PlanetScope-derived data in intensively managed pasture fields at a fine scale.

Author Contributions

Conceptualization, A.A.D.R.; methodology, A.A.D.R. and B.C.S.; software, A.A.D.R., J.C.D.M.E., and J.F.G.A.; formal analysis, A.A.D.R.; writing—original draft preparation, A.A.D.R. and J.P.S.W.; writing—review and editing, A.A.D.R., J.P.S.W., B.C.S., G.K.D.A.F., J.F.G.A., J.C.D.M.E., A.C.C., R.A.C.L., J.V.R., and P.S.G.M.; supervision, J.F.G.A., G.K.D.A.F., R.A.C.L., J.V.R., and P.S.G.M.; funding acquisition, P.S.G.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by FAPESP (Process numbers 2017/50205-9 and 2018/24985-0) and in part by CAPES (Finance Code 001).

Acknowledgments

The authors would like to thank the owner, manager, and staff of the Campina Farm (CV Nelore Mocho) for their support and assistance. We are also grateful to the undergraduate and graduate students, postdoctoral researchers, and technicians for helping with the field data collection and preparation, as well as NIPE and FEAGRI/UNICAMP for the infrastructure support provided for this project’s development.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Nie, Z.; McLean, T.; Clough, A.; Tocker, J.; Christy, B.; Harris, R.; Riffkin, P.; Clark, S.; McCaskill, M. Benefits, challenges and opportunities of integrated crop-livestock systems and their potential application in the high rainfall zone of southern Australia: A review. Agric. Ecosyst. Environ. 2016, 235, 17–31. [Google Scholar] [CrossRef]
  2. Bell, L.W.; Moore, A.D. Integrated crop-livestock systems in Australian agriculture: Trends, drivers and implications. Agric. Syst. 2012, 111, 1–12. [Google Scholar] [CrossRef]
  3. Moraine, M.; Duru, M.; Therond, O. A social-ecological framework for analyzing and designing integrated crop–livestock systems from farm to territory levels. Renew. Agric. Food Syst. 2017, 32, 43–56. [Google Scholar] [CrossRef] [Green Version]
  4. Garrett, R.D.; Niles, M.T.; Gil, J.D.B.; Gaudin, A.; Chaplin-Kramer, R.; Assmann, A.; Assmann, T.S.; Brewer, K.; de Faccio Carvalho, P.C.; Cortner, O.; et al. Social and ecological analysis of commercial integrated crop livestock systems: Current knowledge and remaining uncertainty. Agric. Syst. 2017, 155, 136–146. [Google Scholar] [CrossRef] [Green Version]
  5. Poffenbarger, H.; Artz, G.; Dahlke, G.; Edwards, W.; Hanna, M.; Russell, J.; Sellers, H.; Liebman, M. An economic analysis of integrated crop-livestock systems in Iowa, U.S.A. Agric. Syst. 2017, 157, 51–69. [Google Scholar] [CrossRef]
  6. Cortner, O.; Garrett, R.D.; Valentim, J.F.; Ferreira, J.; Niles, M.T.; Reis, J.; Gil, J. Perceptions of integrated crop-livestock systems for sustainable intensification in the Brazilian Amazon. Land Use Policy 2019, 82, 841–853. [Google Scholar] [CrossRef]
  7. De Souza Filho, W.; de Albuquerque Nunes, P.A.; Barro, R.S.; Kunrath, T.R.; de Almeida, G.M.; Genro, T.C.M.; Bayer, C.; de Faccio Carvalho, P.C. Mitigation of enteric methane emissions through pasture management in integrated crop-livestock systems: Trade-offs between animal performance and environmental impacts. J. Clean. Prod. 2019, 213, 968–975. [Google Scholar] [CrossRef]
  8. Bieluczyk, W.; de Cássia Piccolo, M.; Pereira, M.G.; de Moraes, M.T.; Soltangheisi, A.; de Campos Bernardi, A.C.; Pezzopane, J.R.M.; Oliveira, P.P.A.; Moreira, M.Z.; de Camargo, P.B.; et al. Integrated farming systems influence soil organic matter dynamics in southeastern Brazil. Geoderma 2020, 371, 114368. [Google Scholar] [CrossRef]
  9. Alves, B.J.R.; Madari, B.E.; Boddey, R.M. Integrated crop–Livestock–Forestry systems: Prospects for a sustainable agricultural intensification. Nutr. Cycl. Agroecosystems 2017, 108, 1–4. [Google Scholar] [CrossRef]
  10. Andersson, K.; Trotter, M.; Robson, A.; Schneider, D.; Frizell, L.; Saint, A.; Lamb, D.; Blore, C. Estimating pasture biomass with active optical sensors. Adv. Anim. Biosci. 2017, 8, 754–757. [Google Scholar] [CrossRef]
  11. Legg, M.; Bradley, S. Ultrasonic Arrays for Remote Sensing of Pasture Biomass. Remote Sens. 2019, 11, 2459. [Google Scholar] [CrossRef] [Green Version]
  12. Punalekar, S.M.; Verhoef, A.; Quaife, T.L.; Humphries, D.; Bermingham, L.; Reynolds, C.K. Application of Sentinel-2A data for pasture biomass monitoring using a physically based radiative transfer model. Remote Sens. Environ. 2018, 218, 207–220. [Google Scholar] [CrossRef]
  13. Edirisinghe, A.; Hill, M.J.; Donald, G.E.; Hyder, M. Quantitative mapping of pasture biomass using satellite imagery. Int. J. Remote Sens. 2011, 32, 2699–2724. [Google Scholar] [CrossRef]
  14. Pullanagari, R.; Kereszturi, G.; Yule, I. Integrating airborne hyperspectral, topographic, and soil data for estimating pasture quality using recursive feature elimination with random forest regression. Remote Sens. 2018, 10, 1117. [Google Scholar] [CrossRef] [Green Version]
  15. Wang, J.; et al. Mapping pasture biomass in Mongolia using partial least squares, random forest regression and Landsat 8 imagery. Int. J. Remote Sens. 2018, 39, 3204–3226. [Google Scholar] [CrossRef]
  16. Parente, L.; Ferreira, L.; Faria, A.; Nogueira, S.; Araújo, F.; Teixeira, L.; Hagen, S. Monitoring the Brazilian pasturelands: A new mapping approach based on the Landsat 8 spectral and temporal domains. Int. J. Appl. Earth Obs. Geoinf. 2017, 62, 135–143. [Google Scholar] [CrossRef]
  17. Jakimow, B.; Griffiths, P.; van der Linden, S.; Hostert, P. Mapping pasture management in the Brazilian Amazon from dense Landsat time series. Remote Sens. Environ. 2018, 205, 453–468. [Google Scholar] [CrossRef]
  18. Planet Imagery Product Specifications. Available online: https://assets.planet.com/docs/Planet_Combined_Imagery_Product_Specs_letter_screen.pdf (accessed on 10 May 2020).
  19. Eckert, S. Improved forest biomass and carbon estimations using texture measures from WorldView-2 satellite data. Remote Sens. 2012, 4, 810–829. [Google Scholar] [CrossRef] [Green Version]
  20. Wood, E.M.; Pidgeon, A.M.; Radeloff, V.C.; Keuler, N.S. Image texture as a remotely sensed measure of vegetation structure. Remote Sens. Environ. 2012, 121, 516–526. [Google Scholar] [CrossRef]
  21. Nichol, J.E.; Sarker, M.L.R. Improved Biomass Estimation Using the Texture Parameters of Two High-Resolution Optical Sensors. IEEE Trans. Geosci. Remote Sens. 2011, 49, 930–948. [Google Scholar] [CrossRef] [Green Version]
  22. Holloway, J.; Mengersen, K. Statistical machine learning methods and remote sensing for sustainable development goals: A review. Remote Sens. 2018, 10, 1365. [Google Scholar] [CrossRef] [Green Version]
  23. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  24. Parente, L.; Mesquita, V.; Miziara, F.; Baumann, L.; Ferreira, L. Assessing the pasturelands and livestock dynamics in Brazil, from 1985 to 2017: A novel approach based on high spatial resolution imagery and Google Earth Engine cloud computing. Remote Sens. Environ. 2019, 232, 111301. [Google Scholar] [CrossRef]
  25. Liu, H.; Dahlgren, R.A.; Larsen, R.E.; Devine, S.M.; Roche, L.M.; O'Geen, A.T.; Wong, A.J.Y.; Covello, S.; Jin, Y. Estimating rangeland forage production using remote sensing data from a small unmanned aerial system (sUAS) and PlanetScope satellite. Remote Sens. 2019, 11, 595. [Google Scholar] [CrossRef] [Green Version]
  26. Chen, T.; Guestrin, C. XGBoost: A scalable tree boosting system. In Proceedings of the 22nd Acm Sigkdd International Conference on Knowledge Discovery and Data Mining, New York, NY, USA, 13–17 August 2016; pp. 785–794. [Google Scholar]
  27. Zhang, H.; Eziz, A.; et al. High-resolution vegetation mapping using eXtreme gradient boosting based on extensive features. Remote Sens. 2019, 11, 1505. [Google Scholar]
  28. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  29. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  30. Kanemasu, E.T. Seasonal canopy reflectance patterns of wheat, sorghum, and soybean. Remote Sens. Environ. 1974, 3, 43–47. [Google Scholar] [CrossRef]
  31. Crippen, R. Calculating the vegetation index faster. Remote Sens. Environ. 1990, 34, 71–73. [Google Scholar] [CrossRef]
  32. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
  33. Rouse, J.; Haas, R.; Schell, J.; Deering, D.; Harlan, J. Monitoring vegetation systems in the Great Plains with ERTS. In Proceedings of the Third Earth Resources Technology Satellite-1 Symposium, Washington, DC, USA, 10–14 December 1973; pp. 309–317. [Google Scholar]
  34. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  35. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  36. Richardson, A.J.; Wiegand, C.L. Distinguishing vegetation from soil background information. Photogramm. Eng. Remote Sensing 1977, 43, 1541–1552. [Google Scholar]
  37. Huete, A. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  38. Jordan, C.F. Derivation of Leaf-Area Index from Quality of Light on the Forest Floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
  39. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef] [Green Version]
  40. Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man. Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef] [Green Version]
  41. Jensen, J.R. Remote Sensing of the Environment: An Earth Resource Perspective, 2nd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2007; ISBN 978-0131889507. [Google Scholar]
  42. Bischl, B.; Lang, M.; Kotthoff, L.; Schiffner, J.; Richter, J.; Studerus, E.; Casalicchio, G.; Jones, Z.M. Mlr: Machine learning in R. J. Mach. Learn. Res. 2016, 17, 1–5. [Google Scholar]
  43. Pham, T.D.; Yokoya, N.; Xia, J.; Ha, N.T.; Le, N.N.; Nguyen, T.T.T.; Dao, T.H.; Vu, T.T.P.; Pham, T.D.; Takeuchi, W. Comparison of Machine Learning Methods for Estimating Mangrove Above-Ground Biomass Using Multiple Source Remote Sensing Data in the Red River Delta Biosphere Reserve, Vietnam. Remote Sens. 2020, 12, 1334. [Google Scholar] [CrossRef] [Green Version]
  44. Fan, J.; Wang, X.; Wu, L.; Zhou, H.; Zhang, F.; Yu, X.; Lu, X.; Xiang, Y. Comparison of Support Vector Machine and Extreme Gradient Boosting for predicting daily global solar radiation using temperature and precipitation in humid subtropical climates: A case study in China. Energy Convers. Manag. 2018, 164, 102–111. [Google Scholar] [CrossRef]
  45. Li, Y.; Li, C.; Li, M.; Liu, Z. Influence of variable selection and forest type on forest aboveground biomass estimation using machine learning algorithms. Forests 2019, 10, 1073. [Google Scholar] [CrossRef] [Green Version]
  46. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
  47. Bergstra, J.; Bengio, Y. Random search for hyper-parameter optimization. J. Mach. Learn. Res. 2012, 13, 281–305. [Google Scholar]
  48. Chandrashekar, G.; Sahin, F. A survey on feature selection methods. Comput. Electr. Eng. 2014, 40, 16–28. [Google Scholar] [CrossRef]
  49. Kononenko, I.; Robnik-Šikonja, M.; Pompe, U. ReliefF for estimation and discretization of attributes in classification, regression, and ILP problems. Artif. Intell. Methodol. Syst. Appl. 1996, 18, 1–15. [Google Scholar]
  50. Csillik, O.; Kumar, P.; Asner, G.P. Challenges in Estimating Tropical Forest Canopy Height from Planet Dove Imagery. Remote Sens. 2020, 12, 1160. [Google Scholar] [CrossRef] [Green Version]
  51. Csillik, O.; Kumar, P.; Mascaro, J.; O’Shea, T.; Asner, G.P. Monitoring tropical forest carbon stocks and emissions using Planet satellite data. Sci. Rep. 2019, 9, 1–12. [Google Scholar] [CrossRef] [Green Version]
  52. Solórzano, J.V. Contrasting the potential of Fourier transformed ordination and gray level co-occurrence matrix textures to model a tropical swamp forest’s structural and diversity attributes. J. Appl. Remote Sens. 2018, 12, 1. [Google Scholar] [CrossRef]
  53. Sarker, L.R.; Nichol, J.E. Improved forest biomass estimates using ALOS AVNIR-2 texture indices. Remote Sens. Environ. 2011, 115, 968–977. [Google Scholar] [CrossRef]
  54. Ozdemir, I.; Karnieli, A. Predicting forest structural parameters using the image texture derived from WorldView-2 multispectral imagery in a dryland forest, Israel. Int. J. Appl. Earth Obs. Geoinf. 2011, 13, 701–710. [Google Scholar] [CrossRef]
  55. Kayitakire, F.; Hamel, C.; Defourny, P. Retrieving forest structure variables based on image texture analysis and IKONOS-2 imagery. Remote Sens. Environ. 2006, 102, 390–401. [Google Scholar] [CrossRef]
  56. Yuan, W.; Wijewardane, N.K.; Jenkins, S.; Bai, G.; Ge, Y.; Graef, G.L. Early Prediction of Soybean Traits through Color and Texture Features of Canopy RGB Imagery. Sci. Rep. 2019, 9, 14089. [Google Scholar] [CrossRef] [Green Version]
  57. Li, S.; Yuan, F.; Ata-UI-Karim, S.T.; Zheng, H.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Combining Color Indices and Textures of UAV-Based Digital Imagery for Rice LAI Estimation. Remote Sens. 2019, 11, 1763. [Google Scholar] [CrossRef] [Green Version]
  58. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  59. Goel, N.S. Models of vegetation canopy reflectance and their use in estimation of biophysical parameters from reflectance data. Remote Sens. Rev. 1988, 4, 1–212. [Google Scholar] [CrossRef]
  60. Helman, D.; Bahat, I.; Netzer, Y.; Ben-Gal, A.; Alchanatis, V.; Peeters, A.; Cohen, Y. Using Time Series of High-Resolution Planet Satellite images to monitor grapevine stem water potential in commercial vineyards. Remote Sens. 2018, 10, 1615. [Google Scholar] [CrossRef] [Green Version]
  61. Miller, G.J.; Morris, J.T.; Wang, C. Estimating aboveground biomass and its spatial distribution in coastal wetlands utilizing planet multispectral imagery. Remote Sens. 2019, 11, 2020. [Google Scholar] [CrossRef] [Green Version]
  62. Zhang, B.; Zhang, L.; Xie, D.; Yin, X.; Liu, C.; Liu, G. Application of synthetic NDVI time series blended from Landsat and MODIS data for grassland biomass estimation. Remote Sens. 2016, 8, 10. [Google Scholar] [CrossRef] [Green Version]
  63. Houborg, R.; McCabe, M.F. A CubeSat enabled spatio-temporal enhancement method (CESTEM) utilizing Planet, Landsat and MODIS data. Remote Sens. Environ. 2018, 209, 211–226. [Google Scholar] [CrossRef]
Figure 1. Location of the study area in the western region of São Paulo (SP) state, Brazil. The background image is a PlanetScope scene (true color composite red-green-blue (RGB):321) on 09 June 2019.
Figure 2. Daily meteorological data obtained from the ECMWF (European Centre for Medium-Range Weather Forecasts)/Copernicus Climate Change Service, considering the period from April to November 2019 for the study area.
Figure 3. Spatial distribution of the sampling points within the study area using a stratified systematic unaligned sampling design (a) and field data collection: aboveground biomass (AGB) (b) and canopy height (CH) (c).
Figure 4. Subsets of the texture measures derived from the NIR band using a window size of 7 × 7 pixels and the offset 90° for a PlanetScope imagery from May 2019: (a) mean (MEA), (b) variance (VAR), (c) homogeneity (HOM), (d) contrast (CON), (e) dissimilarity (DIS), (f) entropy (ENT), (g) second moment (2M), and (h) correlation (COR).
Figure 5. Scatterplots of the predicted versus measured values of pasture aboveground biomass (AGB, g m−2) (a) and canopy height (CH, m) (b) obtained using the best extreme gradient boosting (XGBoost) model based on the testing dataset. A 1:1 line (grey, dashed) is provided for reference.
Figure 6. Relative importance of the predictor variables as measured by the feature importance metric in the best extreme gradient boosting (XGBoost) models predicting aboveground biomass (AGB) (a) and canopy height (CH) (b).
Figure 7. Pasture aboveground biomass (AGB) spatial maps predicted by the best extreme gradient boosting (XGBoost) model for the study area in the months of May (a), June (b), July (c), August (d), and November (e).
Figure 8. Pasture canopy height (CH) spatial maps predicted by the best extreme gradient boosting (XGBoost) model for the study area in the months of May (a), June (b), July (c), August (d), and November (e).
Figure 9. Relative RMSE uncertainty (in %) of the estimated aboveground biomass (AGB) (a) and canopy height (CH) (b), with the polynomial function fitted.
Figure 10. Estimated relative RMSE uncertainty (in %) maps for the aboveground biomass (AGB) estimation in the months of May (a), June (b), July (c), August (d), and November (e).
Figure 11. Estimated relative RMSE uncertainty (in %) maps for the canopy height (CH) estimation in the months of May (a), June (b), July (c), August (d), and November (e).
Table 1. Descriptive statistics of the pasture aboveground biomass (AGB) and canopy height (CH) for the field campaigns.
| Target Variable | Field Campaign | Field-Sampled Points | Proportion of Millet:Ruzi Grass (%) | Mean | StdDev | Min | Max | CV (%) |
|---|---|---|---|---|---|---|---|---|
| AGB (g m−2) | May | 100 | 79:21 | 209.53 | 10.82 | 70.33 | 656.08 | 5.16 |
| | June | 50 | 44:56 | 136.75 | 54.91 | 61.58 | 259.02 | 40.15 |
| | July | 100 | 14:86 | 106.95 | 39.05 | 41.08 | 241.99 | 36.51 |
| | August | 38 | 3:97 | 162.61 | 53.68 | 86.85 | 336.30 | 33.01 |
| | November | 58 | 0:100 | 202.19 | 65.26 | 107.30 | 401.10 | 32.28 |
| | All data | 346 | - | 163.19 | 81.97 | 41.08 | 656.08 | 50.23 |
| CH (m) | May | 100 | 79:21 | 0.82 | 0.16 | 0.37 | 1.20 | 19.52 |
| | June | 50 | 44:56 | 0.33 | 0.10 | 0.22 | 0.57 | 29.77 |
| | July | 100 | 14:86 | 0.29 | 0.10 | 0.14 | 0.68 | 34.32 |
| | August | 38 | 3:97 | 0.27 | 0.06 | 0.16 | 0.38 | 23.44 |
| | November | 58 | 0:100 | 0.20 | 0.05 | 0.12 | 0.37 | 24.50 |
| | All data | 346 | - | 0.43 | 0.27 | 0.12 | 1.20 | 63.34 |
Where StdDev = Standard Deviation, Min = minimum, Max = maximum, and CV = coefficient of variation.
Table 2. Summary of the vegetation indices (VIs) used in this study for pasture aboveground biomass (AGB) and canopy height (CH) estimation.
| Index | Name | Formula | Reference |
|---|---|---|---|
| ARVI | Atmospherically Resistant Vegetation Index | (NIR − 2R + B)/(NIR + 2R − B) | [39] |
| BGND | Blue Green Normalized Difference | (G − B)/(G + B) | - |
| DVI | Difference Vegetation Index | NIR − R | [40] |
| EVI | Enhanced Vegetation Index | 2.5(NIR − R)/(NIR + 6R − 7.5B + 1) | [41] |
| EVI2 | Enhanced Vegetation Index 2 | 2.5(NIR − R)/(1 + NIR + 2.4R) | [42] |
| ExB | Excess Blue Vegetation Index | 1.4B − G | [43] |
| ExG | Excess Green Vegetation Index | 2G − R − B | [44] |
| ExGR | Excess Green minus Excess Red Vegetation Index | ExG − ExR | [45] |
| ExR | Excess Red Vegetation Index | 1.4R − G | [46] |
| GLA | Green Leaf Algorithm | (2G − R − B)/(2G + R + B) | [47] |
| GNDVI | Green Normalized Difference Vegetation Index | (NIR − G)/(NIR + G) | [41] |
| GRVI | Green Ratio Vegetation Index | G/R | [48] |
| IPVI | Infrared Percentage Vegetation Index | NIR/(NIR + R) | [49] |
| MGRDI | Modified Green Red Vegetation Index | (G² − R²)/(G² + R²) | [40] |
| MSAVI | Modified Soil-Adjusted Vegetation Index | [2NIR + 1 − √((2NIR + 1)² − 8(NIR − R))]/2 | [50] |
| NDVI | Normalized Difference Vegetation Index | (NIR − R)/(NIR + R) | [51] |
| NGRDI | Normalized Green-Red Difference Index | (G − R)/(G + R) | [40] |
| NIR/GREEN | NIR Green Simple Ratio | NIR/G | - |
| OSAVI | Optimized Soil-Adjusted Vegetation Index | (NIR − R)/(NIR + R + 0.16) | [52] |
| RGBVI | Red Green Blue Vegetation Index | (G² − B·R)/(G² + B·R) | [53] |
| RVI | Ratio Vegetation Index | R/NIR | [54] |
| SAVI | Soil-Adjusted Vegetation Index | [(NIR − R)/(NIR + R + 0.5)]·(1 + 0.5) | [55] |
| SR | Simple Ratio | NIR/R | [56] |
| VARI | Visible Atmospherically Resistant Index | (G − R)/(G + R − B) | [57] |
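Several of the Table 2 indices can be computed directly from the band arrays. The sketch below is illustrative (band names and the 0–1 reflectance scaling are assumptions, not part of the paper); the small epsilon guard against zero denominators is also an addition for numerical safety.

```python
import numpy as np

def vegetation_indices(B, G, R, NIR):
    """A few of the Table 2 indices; inputs are surface-reflectance arrays scaled to 0-1."""
    eps = 1e-12  # guards against division by zero over dark pixels
    return {
        "NDVI":  (NIR - R) / (NIR + R + eps),
        "GNDVI": (NIR - G) / (NIR + G + eps),
        "EVI2":  2.5 * (NIR - R) / (1.0 + NIR + 2.4 * R),
        "SAVI":  (NIR - R) / (NIR + R + 0.5) * 1.5,
        "OSAVI": (NIR - R) / (NIR + R + 0.16),
        "ExG":   2.0 * G - R - B,
        "VARI":  (G - R) / (G + R - B + eps),
    }

# Single-pixel example with plausible green-vegetation reflectances
vi = vegetation_indices(B=np.float64(0.05), G=np.float64(0.10),
                        R=np.float64(0.08), NIR=np.float64(0.45))
```

The same function applies unchanged to full 2-D band arrays thanks to NumPy broadcasting.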
Table 3. Hyperparameters optimized for the XGBoost and RF algorithms and corresponding candidate value ranges for the random search.
| Algorithm | Hyperparameter | Description | Candidate Value Ranges |
|---|---|---|---|
| XGBoost | nrounds | controls the maximum number of iterations | 50–200 |
| | eta | controls the learning rate | 0.01–0.3 |
| | max_depth | controls the depth of the tree | 3–10 |
| | min_child_weight | minimum number of instances required in a child node | 1–10 |
| | subsample | controls the number of observations supplied to a tree | 0.5–1.0 |
| | colsample_bytree | controls the number of predictor variables supplied to a tree | 0.5–1.0 |
| RF | ntree | controls the number of trees | 50–500 |
| | mtry | controls the number of predictor variables randomly sampled at each split | 1–p/2 |
Where p = number of predictor variables.
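Random search over the XGBoost ranges in Table 3 can be sketched generically. The snippet below is illustrative only: the paper used the mlr package in R, and `toy_objective` is a placeholder standing in for the cross-validated RMSE of a fitted model.

```python
import random

# Candidate ranges from Table 3 (XGBoost); each entry draws one random value.
SEARCH_SPACE = {
    "nrounds":          lambda: random.randint(50, 200),
    "eta":              lambda: random.uniform(0.01, 0.3),
    "max_depth":        lambda: random.randint(3, 10),
    "min_child_weight": lambda: random.randint(1, 10),
    "subsample":        lambda: random.uniform(0.5, 1.0),
    "colsample_bytree": lambda: random.uniform(0.5, 1.0),
}

def random_search(objective, n_iter=60, seed=1):
    """Evaluate n_iter random configurations; keep the one with the lowest score."""
    random.seed(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_iter):
        cfg = {name: draw() for name, draw in SEARCH_SPACE.items()}
        score = objective(cfg)  # in practice: cross-validated RMSE of the model
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Toy objective standing in for cross-validated RMSE (favours a mid-range eta/depth).
def toy_objective(cfg):
    return (cfg["eta"] - 0.1) ** 2 + 0.001 * (cfg["max_depth"] - 6) ** 2

best_cfg, best_score = random_search(toy_objective)
```

Unlike grid search, random search samples the joint space, so a fixed evaluation budget covers each individual hyperparameter range more densely [47].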
Table 4. Performance of PlanetScope-derived texture measures in predicting pasture aboveground biomass (AGB) and canopy height (CH) using the random forest (RF) and extreme gradient boosting (XGBoost) regression algorithms based on the testing dataset. The optimum combination of texture parameters is highlighted in boldface.
| Modelling Algorithm | Window Size | Offset (θ) | AGB RMSE (g m−2) | AGB RMSE (%) | AGB R² | CH RMSE (m) | CH RMSE (%) | CH R² |
|---|---|---|---|---|---|---|---|---|
| RF | 3 × 3 | 0° | 53.31 | 34.36 | 0.47 | 0.10 | 22.73 | 0.87 |
| | | 45° | 50.77 | 32.72 | 0.51 | 0.10 | 22.91 | 0.87 |
| | | 90° | 48.49 | 31.25 | 0.52 | 0.10 | 23.09 | 0.87 |
| | | 135° | 51.54 | 33.22 | 0.50 | 0.10 | 22.37 | 0.87 |
| | 5 × 5 | 0° | 49.72 | 32.05 | 0.51 | 0.10 | 23.00 | 0.87 |
| | | 45° | 49.61 | 31.97 | 0.51 | 0.10 | 23.55 | 0.87 |
| | | 90° | 47.79 | 30.80 | 0.54 | 0.10 | 22.84 | 0.87 |
| | | 135° | 47.46 | 30.59 | 0.56 | **0.10** | **22.18** | **0.88** |
| | 7 × 7 | 0° | 46.43 | 29.93 | 0.57 | 0.10 | 22.34 | 0.87 |
| | | 45° | 45.60 | 29.39 | 0.58 | 0.10 | 23.25 | 0.86 |
| | | 90° | **44.09** | **28.42** | **0.60** | 0.10 | 22.89 | 0.87 |
| | | 135° | 46.42 | 29.92 | 0.56 | 0.10 | 22.50 | 0.87 |
| XGBoost | 3 × 3 | 0° | 51.56 | 33.23 | 0.46 | 0.10 | 22.63 | 0.87 |
| | | 45° | 49.12 | 31.66 | 0.52 | 0.10 | 22.30 | 0.87 |
| | | 90° | 47.84 | 30.84 | 0.53 | 0.10 | 22.88 | 0.87 |
| | | 135° | 49.82 | 32.11 | 0.49 | 0.10 | 22.21 | 0.88 |
| | 5 × 5 | 0° | 49.02 | 31.59 | 0.52 | 0.10 | 22.37 | 0.87 |
| | | 45° | 47.53 | 30.63 | 0.54 | 0.10 | 22.20 | 0.87 |
| | | 90° | 47.50 | 30.61 | 0.54 | 0.10 | 21.64 | 0.88 |
| | | 135° | 48.38 | 31.18 | 0.52 | **0.09** | **20.94** | **0.89** |
| | 7 × 7 | 0° | 46.19 | 29.77 | 0.56 | 0.09 | 21.31 | 0.88 |
| | | 45° | 45.22 | 29.15 | 0.58 | 0.10 | 22.02 | 0.87 |
| | | 90° | **41.15** | **26.52** | **0.65** | 0.10 | 22.07 | 0.88 |
| | | 135° | 47.12 | 30.37 | 0.55 | 0.09 | 21.11 | 0.89 |
Table 5. Pasture aboveground biomass (AGB) and canopy height (CH) estimates based on the spectral bands (SC1), the vegetation indices (SC2), the optimum texture measures (SC3), and the combination of spectral bands, vegetation indices, and the optimum texture measures (SC4) using the random forest (RF) and extreme gradient boosting (XGBoost) regression algorithms. The best performing models are highlighted in boldface.
| Modelling Algorithm | Prediction Scenario | AGB RMSE (g m−2) | AGB RMSE (%) | AGB R² | CH RMSE (m) | CH RMSE (%) | CH R² |
|---|---|---|---|---|---|---|---|
| RF | SC1 | 52.47 | 33.82 | 0.46 | 0.10 | 23.70 | 0.86 |
| | SC2 | 50.93 | 32.83 | 0.48 | 0.10 | 22.52 | 0.87 |
| | SC3 | 44.09 | 28.42 | 0.60 | 0.10 | 22.18 | 0.88 |
| | SC4 | 46.49 | 29.96 | 0.55 | 0.10 | 22.50 | 0.87 |
| XGBoost | SC1 | 49.76 | 32.07 | 0.50 | 0.10 | 23.46 | 0.86 |
| | SC2 | 48.99 | 31.58 | 0.50 | 0.10 | 22.61 | 0.87 |
| | SC3 | **41.15** | **26.52** | **0.65** | **0.09** | **20.94** | **0.89** |
| | SC4 | 45.85 | 29.55 | 0.57 | 0.10 | 21.61 | 0.88 |
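The accuracy statistics in Tables 4 and 5 follow standard definitions; a minimal sketch of how they are typically computed is shown below, assuming (as is common, though not stated in the tables themselves) that relative RMSE is expressed as a percentage of the observed mean. The observed/predicted values are toy numbers, not study data.

```python
import math

def accuracy_stats(observed, predicted):
    """RMSE, relative RMSE (as % of the observed mean), and coefficient of determination."""
    n = len(observed)
    mean_obs = sum(observed) / n
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))  # residual sum of squares
    sst = sum((o - mean_obs) ** 2 for o in observed)              # total sum of squares
    rmse = math.sqrt(sse / n)
    return {
        "RMSE": rmse,
        "RMSE%": 100.0 * rmse / mean_obs,
        "R2": 1.0 - sse / sst,
    }

# Toy AGB values (g m^-2): four observed/predicted pairs
stats = accuracy_stats([100.0, 150.0, 200.0, 250.0],
                       [110.0, 140.0, 210.0, 240.0])
```

Expressing RMSE as a percentage of the mean makes the AGB (g m−2) and CH (m) errors directly comparable, which is how the two variables are contrasted in the discussion.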
