Article

Comparison of Multi-Methods for Identifying Maize Phenology Using PhenoCams

1 College of Water Sciences, Beijing Normal University, Beijing 100875, China
2 Department of Biology, Antwerp University, 2000 Antwerp, Belgium
3 Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
4 Heilongjiang Province Key Laboratory of Geographical Environment Monitoring and Spatial Information Service in Cold Regions, Heilongjiang Province Collaborative Innovation Center of Cold Region Ecological Safety, School of Geographical Sciences, Harbin Normal University, Harbin 150025, China
5 Department of Geography and Environmental Sustainability, The University of Oklahoma, 100 East Boyd Street, Norman, OK 73019, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(2), 244; https://doi.org/10.3390/rs14020244
Submission received: 2 December 2021 / Revised: 28 December 2021 / Accepted: 31 December 2021 / Published: 6 January 2022

Abstract: Accurately identifying the phenology of summer maize is crucial for both cultivar breeding and fertilizer management in precision agriculture. In this study, daily RGB images covering the entire growth period of summer maize were collected using phenocams at sites in Shangqiu (2018, 2019 and 2020) and Nanpi (2020) in China. Four phenological dates of summer maize, namely six leaves, booting, heading and maturity, were pre-defined and extracted from the phenocam images. Spectral indices, textural indices and an integrated spectral–textural index were calculated using the improved adaptive feature-weighting method. The double logistic function (DLF), harmonic analysis of time series, Savitzky–Golay filtering and spline interpolation were applied to filter these indices, and the pre-defined phenology was identified and compared with ground observations. The results show that the DLF achieved the highest accuracy, with a coefficient of determination (R2) of 0.86 and a root-mean-square error (RMSE) of 9.32 days. The new index performed better than spectral or textural indices used alone, with an R2 of 0.92 and an RMSE of 9.38 days. Phenological extraction using the new index with the double logistic function on PhenoCam data was effective, convenient and accurate. We therefore recommend adopting the new index, which integrates spectral and textural indices, for extracting maize phenology from PhenoCam data.


1. Introduction

Human activities, climate change and extreme weather events have induced irregular precipitation and elevated air temperatures, raising the risk of a global food crisis [1,2,3,4,5]. Global warming has advanced spring vegetation phenology, leading to greater uniformity across tree species and across elevations [6,7,8,9,10,11].
Both ground observations and model simulations have indicated that major crops such as maize and wheat have been negatively affected by climate change in recent decades [12,13]. Summer maize is one of the dominant crops worldwide and its stable production is crucial for regional and global food security [14,15,16,17]. There is a need to secure maize production by monitoring its growth in a timely manner and employing adaptive management to reduce the negative impacts of climate change [18,19,20]. Generally, the phenology of summer maize can be divided into the vegetative growth stage (from sowing to tasseling) and the reproductive growth stage (from tasseling to maturity). Variations in crop phenology induced by climate change and management practices have changed the total length of the growing season (days from sowing to maturity), altering the accumulated growing degree days (GDDs) [21,22,23]. Phenology is thus believed to have important impacts on maize production and is also a key indicator controlling carbon allocation to organs during different growth stages [22,24]. Therefore, timely and precise monitoring of maize phenology is crucial for reducing the negative impacts of climate change, safeguarding potential agricultural yields and mitigating food crises [14,25,26,27].
Field observation, crop models and remote sensing are the three most widely applied approaches for identifying the phenology of forests and crops [28,29]. Field observation mainly consists of manually recording detailed phenological events, such as budburst, flowering and leaf coloring, across the entire growth cycle of vegetation [9,30]. However, field observation is time consuming and labor intensive, and continuous phenological development over the full growth cycle is rarely recorded under a standard procedure [31]. Alternatively, crop models such as CERES–Maize can simulate the daily growth dynamics of crops with high accuracy, but most models rely on highly resolved input data, including daily precipitation, temperature, soil properties, management practices and crop cultivars [32,33,34,35,36]. Moreover, models can hardly handle extreme events, because such scenarios are not represented in their built-in mechanisms [37,38]. To date, many studies have shown that time-saving and cost-effective remote sensing techniques can be applied for phenological extraction with high accuracy [39,40,41,42]. Combined with a limited number of field observations, remote sensing can achieve high accuracy in regression and classification tasks [43,44,45]. Phenocams are digital cameras that acquire images of ecosystem dynamics at high temporal and spatial resolutions, from which phenological events can be derived [46,47]. As a near-surface remote sensing approach, phenocams record detailed information on vegetation growth at sub-daily (hourly) intervals, generating massive datasets [48]. To date, many studies have reported phenological extraction using phenocams. Zhang et al.
(2018) retrieved six phenological events (green-up onset, mid-green-up phase, maturity onset, senescence onset, mid-senescence phase and dormancy onset) of vegetation using images from a phenocam network; the absolute difference in senescence onset was less than a week compared with the land surface phenology (LSP) product derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) [47]. Xie et al. (2018) obtained in situ phenological events of deciduous tree canopies at both individual and species levels using a phenocam and compared them with ground observations, exploring species-specific phenological responses to climatic variations [49]. Klosterman et al. (2014) extracted phenophase transition dates at 13 temperate deciduous forest sites throughout eastern North America using phenocams and showed that phenological dates derived from high-frequency phenocams had smaller uncertainties than those derived from MODIS and Advanced Very-High-Resolution Radiometer (AVHRR) sensors [50]. Richardson et al. (2007) investigated the potential of phenocams by monitoring the trajectory of spring green-up in a deciduous northern hardwood forest and found that a phenocam network could be deployed to record phenology at regional and national scales [51]. Alberton et al. (2017) monitored leaf phenology in tropical vegetation using phenocams and found them to be promising tools for conservation biology [52]. Liu et al. (2017) found that the normalized difference vegetation indices (NDVIs) of Landsat, MODIS and VIIRS were strongly correlated with NDVIs calculated from phenocams, and that phenological dates in relatively homogeneous open grasslands were consistent across spatial resolutions [53].
These studies have advanced the application of phenocams for extracting vegetation (mainly forest) phenology; crop phenological extraction, however, has received little attention. It is therefore necessary to explore whether phenocams and the existing methods developed for forest phenology remain effective for crops.
Textural information is highly effective for improving the accuracy of image classification, image segmentation, change detection, yield prediction and pattern recognition [26,54,55,56,57]. Lu et al. (2005) found that combining spectral and textural variables improved the accuracy of aboveground biomass estimation using Landsat TM imagery [58]. Yang et al. (2021) assessed the potential of combining textural and spectral information to improve the estimation accuracy of the leaf area index (LAI) throughout the entire growing season of rice [59]. Peña-Barragán et al. (2011) incorporated spectral, textural and hierarchical features after image segmentation, using object-based image analysis (OBIA) techniques with decision tree (DT) algorithms, to assess crop types and field status [60]. To date, the potential of textural information for phenological extraction has been little investigated and no study has combined spectral and textural information to identify crop phenology using phenocams.
In this study, detailed information on summer maize over the entire growing season was recorded using phenocams and its phenology was extracted and compared with ground observations. Spectral information was extracted using different spectral indices, and textural information was derived from the gray-level co-occurrence matrix (GLCM). The time series of the spectral indices, the textural indices and an index combining both, constructed using the improved adaptive feature-weighting method (IAFWM), were then separately filtered using the double logistic function (DLF), harmonic analysis of time series (HANTS), Savitzky–Golay (SG) filtering and spline interpolation (SI). Maize phenology was extracted from the filtered indices and the results were compared with ground observations. The objectives of this paper are to (1) assess the performance of different filtering models for extracting maize phenology, (2) explore whether added textural information is effective for phenological extraction and (3) generate a new index using the IAFWM by integrating spectral and textural indices and evaluate its effectiveness for phenological extraction.

2. Material and Methods

2.1. Study Area

Phenocam images of summer maize were collected at two sites, Nanpi (38.00°N, 116.40°E; elevation 34 m) and Shangqiu (34.51°N, 115.59°E; elevation 52 m) (Figure 1). Both sites are located in the North China Plain (NCP), the main production area of summer maize and winter wheat in China [61,62]. The two sites are well-managed key field scientific observation sites and farmland research stations, built for investigating the impacts of climate change, fertilization and irrigation on crops. Generally, summer maize is sown in late June and harvested in early October, while winter wheat is sown in early November and harvested in early June. Summer maize is the dominant crop type in the NCP and is hereafter referred to simply as maize. The annual average temperature is 12.0 °C at Nanpi and 13.9 °C at Shangqiu, and the annual precipitation is 500 mm and 708 mm, respectively.

2.2. Phenocam and Image Collection

Phenocams were applied to collect RGB images of maize over the entire growing season at Nanpi (2020) and Shangqiu (2018 to 2020), with each camera set up before the growing season. At Shangqiu, the phenocam occupied the same location in 2018 and 2019 and was moved to a new location 400 m away on 14 May 2020 (https://phenocam.sr.unh.edu/webcam/sites/shangqiu/). In 2018 at Shangqiu, the phenocam captured one image every 15 min from 9:15 to 16:40, for a total of 30 images per day. In 2019 and 2020 at Shangqiu, it captured one image every 60 min from 8:00 to 17:00, for a total of 10 images per day. At Nanpi in 2020, the phenocam captured three images every 120 min from 8:00 to 14:00, yielding 12 images per day. Maize phenology comprised six leaves, booting, heading and maturity; the definition of these phenological events according to the phenological signals is shown in Figure 2.

2.3. Treatment and Ground Observations

Maize was treated with sufficient fertilizers, including nitrogen (N), phosphorus (P) and potassium (K), and pest control was strictly conducted by professional workers under a standard procedure. The observed phenological events, including six leaves, booting, heading and maturity, of maize at Nanpi (2020) and Shangqiu (2018 to 2020) were recorded in day-of-year (DOY) format (Table 1). Each date was confirmed when the phenological event had occurred in 50% of the maize plants in the imaging area.

2.4. Spectral and Textural Indices

For the entire growing season of maize at each site in a single year, the RGB images were subset using the same region of interest (ROI) (Figure 1). The original digital numbers (DNs) were normalized (0–1) by dividing each band by the sum of the r, g and b bands [63]. To avoid disturbances, the images were first classified into vegetation (maize) and non-vegetation (background, including soil and other disturbances) [26,64]. Four spectral indices were selected for their strong performance reported in previous studies (Table 2). For each day, the average value over all pixels, the average value between the 25th and 75th percentiles and the average value above the 90th percentile of each spectral index were calculated and compared [49,65].
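As an illustration of this normalization and percentile averaging, the minimal Python sketch below uses the green chromatic coordinate (GCC) as a representative spectral index; the authors' exact ROI mask and index set are not reproduced here:

```python
import numpy as np

def chromatic_coordinates(rgb):
    """Normalize digital numbers to 0-1 chromatic coordinates (r, g, b)."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on fully dark pixels
    return rgb / total

def gcc(rgb, mask=None):
    """Green chromatic coordinate per pixel; mask optionally selects vegetation."""
    g = chromatic_coordinates(rgb)[..., 1]
    if mask is not None:
        g = g[mask]
    return g.ravel()

def percentile_mean(values, lo=25, hi=75):
    """Mean of the values lying between the lo-th and hi-th percentiles."""
    p_lo, p_hi = np.percentile(values, [lo, hi])
    sel = values[(values >= p_lo) & (values <= p_hi)]
    return float(sel.mean())

# toy 2x2 "image": pure green, gray, mixed-green and black pixels
img = np.array([[[0, 255, 0], [100, 100, 100]],
                [[50, 150, 50], [0, 0, 0]]], dtype=np.uint8)
vals = gcc(img)
daily_gcc = percentile_mean(vals)  # one daily GCC value for the time series
```

Averaging only the 25th–75th percentile band damps outliers from shadows and specular highlights, which is why such percentile statistics are compared in the text.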
Textural indices are important in object detection, image classification and image processing [60,71,72,73,74]. The most commonly applied textural indices, such as contrast, correlation, energy and homogeneity, can be extracted from the gray-level co-occurrence matrix (GLCM) using statistical approaches [75,76,77,78]. The textural indices applied in this study are listed in Table 3.
The impact of the distance d was assessed by varying the size (length and width) of the calculation window over 10 × 10, 20 × 20, 30 × 30, …, 100 × 100 pixels, and the optimal window size was selected by balancing computation time and accuracy. The index values barely changed from 10 × 10 to 30 × 30 and gradually decreased as the window size grew from 30 × 30 to 100 × 100; the 30 × 30 window was therefore selected, considering both accuracy and computational efficiency. As with the spectral indices, the average value over all pixels, the average value between the 25th and 75th percentiles and the average value above the 90th percentile of each textural index were obtained for each day and compared.
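The four GLCM indices can be computed directly from their standard statistical definitions. The sketch below builds a symmetric, normalized GLCM in plain NumPy; the offset convention (dx, dy) and the gray-level quantization are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def glcm(gray, dx=1, dy=0, levels=16):
    """Symmetric, normalized gray-level co-occurrence matrix for offset (dx, dy).
    gray: 2-D integer image already quantized to `levels` gray levels."""
    h, w = gray.shape
    P = np.zeros((levels, levels), dtype=float)
    for i in range(max(0, -dy), h - max(0, dy)):
        for j in range(max(0, -dx), w - max(0, dx)):
            a, b = gray[i, j], gray[i + dy, j + dx]
            P[a, b] += 1
            P[b, a] += 1  # count both directions -> symmetric matrix
    return P / P.sum()

def texture_indices(P):
    """Contrast, correlation, energy and homogeneity from a normalized GLCM."""
    levels = P.shape[0]
    i, j = np.indices((levels, levels))
    mu_i = (i * P).sum()
    mu_j = (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    return {
        "contrast": ((i - j) ** 2 * P).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j),
        "energy": (P ** 2).sum(),
        "homogeneity": (P / (1.0 + np.abs(i - j))).sum(),
    }
```

For a 2 × 2 vertical-stripe patch `[[0, 1], [0, 1]]` with a horizontal offset, every co-occurring pair is (0, 1), giving contrast 1, energy 0.5, homogeneity 0.5 and correlation −1, which matches the standard definitions.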
The improved adaptive feature-weighting method (IAFWM) determines the relative weights of the spectral and textural information [25,26]. The new index was formed using these weights and the same processing procedure was applied to it.
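The exact IAFWM formula is not printed in this paper, so the sketch below illustrates only the general idea: fusing the normalized spectral and textural series with a spectral-to-textural ratio w (the reported ratios range from 0.590 to 0.959) as a convex combination, then rescaling by the maximum as described in Section 3.2. The combination form is an assumption for illustration:

```python
import numpy as np

def combine_indices(spectral, textural, w):
    """Illustrative weighted fusion of a spectral and a textural time series.
    w is the spectral-to-textural weight ratio; the convex-combination form
    is an assumed stand-in for the IAFWM, whose formula the paper omits."""
    s = np.asarray(spectral, float)
    t = np.asarray(textural, float)
    s = s / s.max()                  # normalize each series to its maximum
    t = t / t.max()
    new = (w * s + t) / (w + 1.0)    # weights (w, 1) in ratio w = w_spec / w_text
    return new / new.max()           # rescale to match the original indices
```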

2.5. Extraction of Phenology of Maize

The Savitzky–Golay (SG) filter, double logistic function (DLF), spline interpolation (SI) and harmonic analysis of time series (HANTS) were independently applied to extract maize phenology at the two sites in each year [42,79,80]. A Gaussian method was first applied to remove abnormal index values; the SG, DLF, SI and HANTS were then applied to filter the indices. For phenological extraction, the first derivative (FD) of each filtered index was calculated; the DOY at which the FD equaled 0.012 was defined as the six-leaves date, the DOY of the maximum FD as the booting date, the DOY of the maximum filtered index value as the heading date and the DOY of the minimum FD as the maturity date. The extracted dates were compared with ground observations and the effectiveness of each extraction method was evaluated using the coefficient of determination (R2) and the root-mean-square error (RMSE).
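These extraction rules can be sketched as follows. The double logistic parameterization below is one common form (the paper does not specify its exact formulation), and the FD threshold of 0.012 for the six-leaves date is taken from the text:

```python
import numpy as np
from scipy.optimize import curve_fit

def dlf(t, base, amp, k1, t1, k2, t2):
    """Double logistic: a rise centered at t1 (green-up) and a fall at t2."""
    return base + amp * (1.0 / (1.0 + np.exp(-k1 * (t - t1)))
                         - 1.0 / (1.0 + np.exp(-k2 * (t - t2))))

def extract_phenology(doy, index, fd_six_leaves=0.012):
    """Fit the DLF and apply the first-derivative rules from Section 2.5."""
    index = np.asarray(index, float)
    p0 = [index.min(), index.max() - index.min(),       # rough initial guesses
          0.1, doy[len(doy) // 4], 0.1, doy[3 * len(doy) // 4]]
    popt, _ = curve_fit(dlf, doy, index, p0=p0, maxfev=10000)
    grid = np.arange(doy.min(), doy.max() + 1)
    fit = dlf(grid, *popt)
    fd = np.gradient(fit, grid)                          # first derivative
    six_leaves = grid[np.argmax(fd >= fd_six_leaves)]    # FD crosses 0.012
    booting = grid[np.argmax(fd)]                        # steepest green-up
    heading = grid[np.argmax(fit)]                       # peak of filtered index
    maturity = grid[np.argmin(fd)]                       # steepest decline
    return six_leaves, booting, heading, maturity
```

Applied to a synthetic curve generated from the same function, the booting and maturity dates land on the two inflection points (t1, t2), with heading at the curve's peak between them.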

3. Results

3.1. The Variations in Spectral and Textural Indices during the Growth Period of Maize

The daily images were processed using the methods introduced in Section 2.4. Since maize is an annual crop, the average value over all pixels, the average value between the 25th and 75th percentiles and the average value above the 90th percentile of the spectral and textural indices were obtained for each day of each annual growth cycle (Figure 3). The differences among the three percentile statistics were small for both spectral and textural indices, so their influence can be ignored and the daily average values were selected for phenological extraction. The selected spectral indices (Figure 3a–d) represented the dynamic growth of maize well. Correlation (Figure 3f) also tracked the dynamic growth of maize, whereas contrast, energy and homogeneity failed to match its growth stages. Therefore, all the spectral indices and correlation were selected for phenological extraction.

3.2. The Forming of New Index and Filtering of All Indices

The IAFWM was conducted and the ratios between the spectral and textural indices were 0.590, 0.905 and 0.959 for Shangqiu in 2018, 2019 and 2020, respectively, and 0.779 for Nanpi in 2020. The new index was generated for each site and year and then normalized by its maximum value to make it consistent with the original spectral and textural indices. All the selected spectral and textural indices and the new integrated index were independently filtered using the DLF, HANTS, SG and SI (Figure 4). The curves of the filtered indices differed markedly depending on the filtering method applied.

3.3. Phenological Extraction Using Different Filtering Methods

The filtered indices were applied to extract the phenological events, including six leaves, booting, heading and maturity, of maize for each site in each year. To highlight the differences among filtering methods, the extracted phenological dates were compared with observed phenology across all sites and years (Figure 5). The DLF achieved the highest accuracy, with an R2 of 0.86 and an RMSE of 9.32 days, followed by the SG, with an R2 of 0.83 and an RMSE of 9.60 days. The R2 (RMSE) was 0.80 (10.05 days) for the HANTS and 0.75 (12.93 days) for the SI. Thus, the DLF and SG outperformed the HANTS and SI, and the DLF achieved the highest accuracy in maize phenological extraction.
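The accuracy statistics can be reproduced from paired observed and extracted dates. Note that R2 is computed here as 1 − SSres/SStot, one common definition; the paper does not state whether it was instead taken from a fitted regression line:

```python
import numpy as np

def accuracy(observed, extracted):
    """R2 and RMSE (days) between observed and extracted phenological dates."""
    obs = np.asarray(observed, float)
    ext = np.asarray(extracted, float)
    ss_res = ((obs - ext) ** 2).sum()                 # residual sum of squares
    ss_tot = ((obs - obs.mean()) ** 2).sum()          # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = float(np.sqrt(((obs - ext) ** 2).mean()))
    return r2, rmse

# hypothetical DOY pairs for illustration only (not the paper's data)
r2, rmse = accuracy([180, 200, 230, 270], [182, 198, 232, 268])
```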
To further examine the differences among filtering models, the R2 and RMSE between extracted and observed maize phenology were calculated for each site and year (Figure 6). The DLF performed best in 2018 and 2020, with the highest R2, and the DLF and SG achieved the same R2 in 2019. Similar results were found for the RMSE, with the lowest values for the DLF, followed by the SG, HANTS and SI, in that order. The performance of the filtering methods for each site and year is shown in Figure A1, Figure A2, Figure A3 and Figure A4.

3.4. Phenological Extraction Using Different Indices

Maize phenological extraction using spectral indices alone, textural indices alone and the new index was assessed using data from all sites and years, as well as for each site and year (Figure 7). The extracted phenology was compared with ground observations in each case. The new index achieved the highest accuracy, with an R2 of 0.917 and an RMSE of 9.381 days, and obtained the highest R2 and lowest RMSE for Shangqiu in 2019 and 2020 and for Nanpi in 2020. It can thus be concluded that the new index, integrating spectral and textural information, achieved the highest accuracy for maize phenological extraction. Meanwhile, correlation performed worst among all indices; we therefore speculate that the lower extraction accuracy in 2019 and 2020 may have been induced by the correlation index.

4. Discussion

4.1. Comparison of Phenological Extraction with Previous Studies

In this study, phenocam images of maize were innovatively applied to calculate spectral and textural indices for maize phenological extraction. Maize phenological events, including six leaves, booting, heading and maturity, were pre-defined, extracted and compared with ground observations. Four filtering methods were independently applied and the DLF achieved the highest accuracy, with an R2 of 0.92 and an RMSE of 9.38 days, probably because the real growth of vegetation most closely follows the form of the DLF [81,82,83,84]. The current result is slightly more accurate than the reported extraction of the start of green-up (SOG) using phenocam RGB data for soybean, for which the RMSE was 14 days, and between 6 and 7 days for the other phenophases [65]. Liu et al. defined the start of the growing season of winter wheat and extracted phenology from filtered GCC using the double logistic model, with an RMSE of about 10 days [85]. Another study reported that the RMSE for the start of season (SOS) and end of season (EOS) ranged from 3.3 to 5.5 days and from 3.0 to 11.6 days, respectively, for deciduous forest tree canopies at nine sites on the University of Connecticut campus in Mansfield, Connecticut, USA [49]. A detailed comparison with previous studies is given in Table A1. Previous studies mainly extracted the SOS and/or EOS of forests, while the current study extracted four important phenological events of maize with a comparable R2 and RMSE. Therefore, the proposed new index with the DLF achieved relatively high accuracy and offers an effective and convenient approach for phenological extraction.

4.2. Performance of the New Index

The new index, which integrates spectral and textural indices, achieved the highest accuracy. It combines the advantages of spectrum and texture, precisely capturing the dynamic growth of maize and describing detailed variations in its growth status. This may be because textural information expresses the dynamic growth of maize in a different way than spectral information does [26,86,87]; the added textural information thus improved the accuracy of maize phenological extraction. We therefore recommend exploring the use of the new index for identifying the phenology of other crops, such as winter wheat and soybean. In addition, crop phenotypic traits, such as plant height, stem diameter, leaf length, leaf width, leaf area and leaf inclination, are also important indicators of crop growth dynamics and their potential for identifying crop phenology should be investigated [88,89,90].

4.3. Limitations

Phenocams can be deployed to monitor the dynamic growth of vegetation (forests and crops) by acquiring data at relatively high temporal and spatial resolutions [91,92]. However, the approach has several limitations that should be acknowledged. Firstly, the PhenoCams deployed in this study were commercial cameras, whose bandwidths are typically too wide to identify the absorption features of pigments [29,93]. Continuous data collection using multi-spectral cameras might therefore be more suitable for monitoring vegetation phenology [94,95,96,97]. Multi-spectral images can be used to extract indices such as the NDVI and EVI, and a camera with narrow bands could also be used to estimate solar-induced chlorophyll fluorescence (SIF) [98,99,100,101,102,103]. The potential of multi-spectral images for phenological extraction could thus be tested in further analyses. Secondly, the viewing geometry of a phenocam changes continuously across the image and, consequently, across the field [65,104]; it also depends on the elevation and viewing angle of the camera. It is difficult to monitor the full study area with only one phenocam when the area is relatively large [105]. Fortunately, the easy deployment of unmanned aerial vehicles/systems (UAV/UASs) provides a new perspective for acquiring high-throughput data for the phenotyping of vegetation (forests and crops) [106,107,108]. There is an increasing need to explore the potential of UAV/UAS-based data for phenological extraction [109,110,111,112]. Images from UAV/UASs and phenocams are captured from different viewing angles, and the complementary data from the two platforms can be combined to integrate their advantages [48,113].
Thirdly, the phenological extraction methods rely on filtered indices, and the extracted phenological dates are very sensitive to the input parameters of the filtering models [80,114]. Machine learning methods such as support vector machines (SVM), random forests (RF) and back-propagation (BP) neural networks, as well as deep learning approaches such as AlexNet and GoogLeNet, are excellent classification models that could be applied to identify maize phenology [115,116,117,118,119,120]. Nevertheless, phenocams remain an important approach for monitoring vegetation phenology and for analyzing crop yields based on high-throughput (high temporal and spatial resolution) data [121,122,123].

5. Conclusions

In this study, time series of RGB images retrieved from phenocams were used to extract spectral indices, textural indices and a new index integrating both. The indices were separately filtered using four commonly applied methods: the double logistic function, harmonic analysis of time series, Savitzky–Golay filtering and spline interpolation. The six leaves, booting, heading and maturity events of maize were identified from the filtered indices and compared with ground observations. The double logistic function achieved the highest accuracy among the DLF, HANTS, SG and SI. The new index, obtained by integrating the spectral and textural indices with the improved adaptive feature-weighting method, performed best. We therefore highly recommend adopting the new integrated index together with the double logistic function when identifying maize phenology using phenocams.

Author Contributions

Conceptualization, Y.G. and S.C.; methodology, Y.G. and W.W.; software, Y.X.; validation, Y.G., S.C. and Y.X.; formal analysis, Y.G., S.C. and Y.X.; investigation, Y.G., S.C. and Y.H.F.; resources, Y.H.F. and W.W.; data curation, Y.G., S.C. and Y.X.; writing—original draft preparation, Y.G.; writing—review and editing, Y.G., Y.H.F., H.W., W.W. and K.d.B.; visualization, Y.G. and Y.X.; supervision, Y.H.F. and W.W.; project administration, Y.H.F. and W.W.; funding acquisition, Y.H.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

This study was funded by the National Science Fund for Distinguished Young Scholars (Grant No. 42025101), the National Natural Science Foundation of China (Grant No. 31770516) and the 111 Project (Grant No. B18006). We thank Yongguang Zhang for sharing the phenocam data of maize at the Shangqiu site and Hongyong Sun for his support in phenocam data collection at the Nanpi site. We also thank the reviewers for their useful comments and suggestions, which improved the quality of the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Figure A1. Comparison of extracted and observed phenology of maize using different filtering methods at the Shangqiu site in 2018. Note: (a–d) show the comparison of extracted phenology using the DLF, HANTS, SG and SI, respectively.
Figure A2. Comparison of extracted and observed phenology of maize using different filtering methods at the Shangqiu site in 2019. Note: (a–d) show the comparison of extracted phenology using the DLF, HANTS, SG and SI, respectively.
Figure A3. Comparison of extracted and observed phenology of maize using different filtering methods at the Shangqiu site in 2020. Note: (a–d) show the comparison of extracted phenology using the DLF, HANTS, SG and SI, respectively.
Figure A4. Comparison of extracted and observed phenology from ground observations of maize using different filtering methods at the site in Nanpi in 2020. Note: (ad) represented the comparison of extracted phenology using DLF, HANTS, SG, and SI, respectively.
Figure A4. Comparison of extracted and observed phenology from ground observations of maize using different filtering methods at the site in Nanpi in 2020. Note: (ad) represented the comparison of extracted phenology using DLF, HANTS, SG, and SI, respectively.
Table A1. Comparison of phenological extraction with previous studies.

| Study | Crop | Filtering Method | Phenological Events | R2 | RMSE (Days) |
|---|---|---|---|---|---|
| Helge Aasen et al. | Soybean | Savitzky–Golay filter | Start of green up (SOG) | 0.78 | 14 |
| Yingying Xie et al. | Deciduous forest tree | Logistic curves | Start of season, end of season | --- | From 3.3 to 5.5, from 3.0 to 11.6 |
| Yujie Liu et al. | Winter wheat | Double logistic | Start of growing season, stabilization date, position of peak greenness, downturn date, end of season | --- | 10 |
| This study | Summer maize | Double logistic function, harmonic analysis of time series, Savitzky–Golay and spline interpolation | Six leaves, booting, heading, maturity | 0.92 | 9.38 |

Note: '---' indicates no record in the paper.

References

  1. Schlenker, W.; Lobell, D.B. Robust negative impacts of climate change on African agriculture. Environ. Res. Lett. 2010, 5, 014010. [Google Scholar] [CrossRef]
  2. Ramirez-Villegas, J.; Jarvis, A.; Läderach, P. Empirical approaches for assessing impacts of climate change on agriculture: The EcoCrop model and a case study with grain sorghum. Agric. For. Meteorol. 2013, 170, 67–78. [Google Scholar] [CrossRef]
  3. Lobell, D.B.; Schlenker, W.; Costa-Roberts, J. Climate trends and global crop production since 1980. Science 2011, 333, 616–620. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Zhao, C.; Liu, B.; Piao, S.; Wang, X.; Lobell, D.B.; Huang, Y.; Huang, M.; Yao, Y.; Bassu, S.; Ciais, P. Temperature increase reduces global yields of major crops in four independent estimates. Proc. Natl. Acad. Sci. USA 2017, 114, 9326–9331. [Google Scholar] [CrossRef] [Green Version]
  5. Howden, S.M.; Soussana, J.-F.; Tubiello, F.N.; Chhetri, N.; Dunlop, M.; Meinke, H. Adapting agriculture to climate change. Proc. Natl. Acad. Sci. USA 2007, 104, 19691–19696. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Vitasse, Y.; Signarbieux, C.; Fu, Y.H. Global warming leads to more uniform spring phenology across elevations. Proc. Natl. Acad. Sci. USA 2018, 115, 1004–1008. [Google Scholar] [CrossRef] [Green Version]
  7. Geng, X.; Fu, Y.H.; Hao, F.; Zhou, X.; Zhang, X.; Yin, G.; Vitasse, Y.; Piao, S.; Niu, K.; De Boeck, H.J. Climate warming increases spring phenological differences among temperate trees. Glob. Chang. Biol. 2020, 26, 5979–5987. [Google Scholar] [CrossRef] [PubMed]
  8. Fu, Y.H.; Zhang, X.; Piao, S.; Hao, F.; Geng, X.; Vitasse, Y.; Zohner, C.; Peñuelas, J.; Janssens, I.A. Daylength helps temperate deciduous trees to leaf-out at the optimal time. Glob. Chang. Biol. 2019, 25, 2410–2418. [Google Scholar] [CrossRef] [PubMed]
  9. Piao, S.; Liu, Q.; Chen, A.; Janssens, I.A.; Fu, Y.; Dai, J.; Liu, L.; Lian, X.; Shen, M.; Zhu, X. Plant phenology and global climate change: Current progresses and challenges. Glob. Chang. Biol. 2019, 25, 1922–1940. [Google Scholar] [CrossRef]
  10. Wu, Z.; Chen, S.; De Boeck, H.J.; Stenseth, N.C.; Tang, J.; Vitasse, Y.; Wang, S.; Zohner, C.; Fu, Y.H. Atmospheric brightening counteracts warming-induced delays in autumn phenology of temperate trees in Europe. Glob. Ecol. Biogeogr. 2021, 30, 2477–2487. [Google Scholar] [CrossRef]
  11. Li, P.; Liu, Z.; Zhou, X.; Xie, B.; Li, Z.; Luo, Y.; Zhu, Q.; Peng, C. Combined control of multiple extreme climate stressors on autumn vegetation phenology on the Tibetan Plateau under past and future climate change. Agric. For. Meteorol. 2021, 308, 108571. [Google Scholar] [CrossRef]
  12. Qian, B.; Zhang, X.; Smith, W.; Grant, B.; Jing, Q.; Cannon, A.J.; Neilsen, D.; McConkey, B.; Li, G.; Bonsal, B. Climate change impacts on Canadian yields of spring wheat, canola and maize for global warming levels of 1.5 °C, 2.0 °C, 2.5 °C and 3.0 °C. Environ. Res. Lett. 2019, 14, 074005. [Google Scholar] [CrossRef]
  13. He, Q.; Zhou, G.; Lü, X.; Zhou, M. Climatic suitability and spatial distribution for summer maize cultivation in China at 1.5 and 2.0 °C global warming. Sci. Bull. 2019, 64, 690–697. [Google Scholar] [CrossRef] [Green Version]
  14. Zhu, W.; Sun, Z.; Peng, J.; Huang, Y.; Li, J.; Zhang, J.; Yang, B.; Liao, X. Estimating maize above-ground biomass using 3D point clouds of multi-source unmanned aerial vehicle data at multi-spatial scales. Remote Sens. 2019, 11, 2678. [Google Scholar] [CrossRef] [Green Version]
  15. Jones, P.G.; Thornton, P.K. The potential impacts of climate change on maize production in Africa and Latin America in 2055. Glob. Environ. Chang. 2003, 13, 51–59. [Google Scholar] [CrossRef]
  16. Cairns, J.E.; Hellin, J.; Sonder, K.; Araus, J.L.; MacRobert, J.F.; Thierfelder, C.; Prasanna, B.M. Adapting maize production to climate change in sub-Saharan Africa. Food Secur. 2013, 5, 345–360. [Google Scholar] [CrossRef] [Green Version]
  17. Li, X.; Takahashi, T.; Suzuki, N.; Kaiser, H.M. The impact of climate change on maize yields in the United States and China. Agric. Syst. 2011, 104, 348–353. [Google Scholar] [CrossRef]
  18. Zhao, B.; Duan, A.; Ata-Ul-Karim, S.T.; Liu, Z.; Chen, Z.; Gong, Z.; Zhang, J.; Xiao, J.; Liu, Z.; Qin, A. Exploring new spectral bands and vegetation indices for estimating nitrogen nutrition index of summer maize. Eur. J. Agron. 2018, 93, 113–125. [Google Scholar] [CrossRef]
  19. Chen, X.; Mo, X.; Zhang, Y.; Sun, Z.; Liu, Y.; Hu, S.; Liu, S. Drought detection and assessment with solar-induced chlorophyll fluorescence in summer maize growth period over North China Plain. Ecol. Indic. 2019, 104, 347–356. [Google Scholar] [CrossRef]
  20. Zhang, M.; Zhu, D.; Su, W.; Huang, J.; Zhang, X.; Liu, Z. Harmonizing multi-source remote sensing images for summer corn growth monitoring. Remote Sens. 2019, 11, 1266. [Google Scholar] [CrossRef] [Green Version]
  21. Liu, Y.; Qin, Y.; Ge, Q. Spatiotemporal differentiation of changes in maize phenology in China from 1981 to 2010. J. Geogr. Sci. 2019, 29, 351–362. [Google Scholar] [CrossRef] [Green Version]
  22. Guo, Y.; Fu, Y.; Hao, F.; Zhang, X.; Wu, W.; Jin, X.; Bryant, C.R.; Senthilnath, J. Integrated phenology and climate in rice yields prediction using machine learning methods. Ecol. Indic. 2021, 120, 106935. [Google Scholar] [CrossRef]
  23. Tao, F.; Zhang, Z.; Shi, W.; Liu, Y.; Xiao, D.; Zhang, S.; Zhu, Z.; Wang, M.; Liu, F. Single rice growth period was prolonged by cultivars shifts, but yield was damaged by climate change during 1981–2009 in China, and late rice was just opposite. Glob. Chang. Biol. 2013, 19, 3200–3209. [Google Scholar] [CrossRef]
  24. Guo, Y.; Wu, W.; Liu, Y.; Wu, Z.; Geng, X.; Zhang, Y.; Bryant, C.R.; Fu, Y. Impacts of Climate and Phenology on the Yields of Early Mature Rice in China. Sustainability 2020, 12, 10133. [Google Scholar] [CrossRef]
  25. Guo, Y.; Chen, S.; Wu, Z.; Wang, S.; Robin Bryant, C.; Senthilnath, J.; Cunha, M.; Fu, Y.H. Integrating Spectral and Textural Information for Monitoring the Growth of Pear Trees Using Optical Images from the UAV Platform. Remote Sens. 2021, 13, 1795. [Google Scholar] [CrossRef]
  26. Guo, Y.; Fu, Y.H.; Chen, S.; Bryant, C.R.; Li, X.; Senthilnath, J.; Sun, H.; Wang, S.; Wu, Z.; de Beurs, K. Integrating spectral and textural information for identifying the tasseling date of summer maize using UAV based RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102435. [Google Scholar] [CrossRef]
  27. Herrmann, I.; Bdolach, E.; Montekyo, Y.; Rachmilevitch, S.; Townsend, P.A.; Karnieli, A. Assessment of maize yield and phenology by drone-mounted superspectral camera. Precis. Agric. 2020, 21, 51–76. [Google Scholar] [CrossRef]
  28. Zhang, S.; Dai, J.; Ge, Q. Responses of autumn phenology to climate change and the correlations of plant hormone regulation. Sci. Rep. 2020, 10, 1–10. [Google Scholar] [CrossRef] [PubMed]
  29. Li, Q.; Shen, M.; Chen, X.; Wang, C.; Chen, J.; Cao, X.; Cui, X. Optimal Color Composition Method for Generating High-Quality Daily Photographic Time Series From PhenoCam. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 6179–6193. [Google Scholar] [CrossRef]
  30. Wolf, A.A.; Zavaleta, E.S.; Selmants, P.C. Flowering phenology shifts in response to biodiversity loss. Proc. Natl. Acad. Sci. USA 2017, 114, 3463–3468. [Google Scholar] [CrossRef] [Green Version]
  31. Gnyp, M.L.; Miao, Y.; Yuan, F.; Ustin, S.L.; Yu, K.; Yao, Y.; Huang, S.; Bareth, G. Hyperspectral canopy sensing of paddy rice aboveground biomass at different growth stages. Field Crops Res. 2014, 155, 42–55. [Google Scholar] [CrossRef]
  32. Kim, M.; Ko, J.; Jeong, S.; Yeom, J.-m.; Kim, H.-O. Monitoring canopy growth and grain yield of paddy rice in South Korea by using the GRAMI model and high spatial resolution imagery. GIScience Remote Sens. 2017, 54, 534–551. [Google Scholar] [CrossRef]
  33. Guo, Y.; Wu, W.; Du, M.; Liu, X.; Wang, J.; Bryant, C.R. Modeling climate change impacts on rice growth and yield under global warming of 1.5 and 2.0 C in the Pearl River Delta, China. Atmosphere 2019, 10, 567. [Google Scholar] [CrossRef] [Green Version]
  34. Radanielson, A.; Gaydon, D.; Li, T.; Angeles, O.; Roth, C. Modeling salinity effect on rice growth and grain yield with ORYZA v3 and APSIM-Oryza. Eur. J. Agron. 2018, 100, 44–55. [Google Scholar] [CrossRef] [PubMed]
  35. Timsina, J.; Humphreys, E. Performance of CERES-Rice and CERES-Wheat models in rice–wheat systems: A review. Agric. Syst. 2006, 90, 5–31. [Google Scholar] [CrossRef]
  36. Fang, H.; Liang, S.; Hoogenboom, G.; Teasdale, J.; Cavigelli, M. Corn-yield estimation through assimilation of remotely sensed data into the CSM-CERES-Maize model. Int. J. Remote Sens. 2008, 29, 3011–3032. [Google Scholar] [CrossRef]
  37. Leblois, A.; Quirion, P. Agricultural insurances based on meteorological indices: Realizations, methods and research challenges. Meteorol. Appl. 2013, 20, 1–9. [Google Scholar] [CrossRef] [Green Version]
  38. Pirttioja, N.; Carter, T.R.; Fronzek, S.; Bindi, M.; Hoffmann, H.; Palosuo, T.; Ruiz-Ramos, M.; Tao, F.; Trnka, M.; Acutis, M. Temperature and precipitation effects on wheat yield across a European transect: A crop model ensemble analysis using impact response surfaces. Clim. Res. 2015, 65, 87–105. [Google Scholar] [CrossRef] [Green Version]
  39. De Beurs, K.M.; Henebry, G.M. Land surface phenology, climatic variation, and institutional change: Analyzing agricultural land cover change in Kazakhstan. Remote Sens. Environ. 2004, 89, 497–509. [Google Scholar] [CrossRef]
  40. Zhang, X.; Friedl, M.A.; Schaaf, C.B.; Strahler, A.H.; Hodges, J.C.; Gao, F.; Reed, B.C.; Huete, A. Monitoring vegetation phenology using MODIS. Remote Sens. Environ. 2003, 84, 471–475. [Google Scholar] [CrossRef]
  41. Xue, Z.; Du, P.; Feng, L. Phenology-driven land cover classification and trend analysis based on long-term remote sensing image series. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 1142–1156. [Google Scholar] [CrossRef]
  42. White, M.A.; de Beurs, K.M.; Didan, K.; Inouye, D.W.; Richardson, A.D.; Jensen, O.P.; O’keefe, J.; Zhang, G.; Nemani, R.R.; van Leeuwen, W.J. Intercomparison, interpretation, and assessment of spring phenology in North America estimated from remote sensing for 1982–2006. Glob. Chang. Biol. 2009, 15, 2335–2359. [Google Scholar] [CrossRef]
  43. Comber, A.; Fisher, P.; Brunsdon, C.; Khmag, A. Spatial analysis of remote sensing image classification accuracy. Remote Sens. Environ. 2012, 127, 237–246. [Google Scholar] [CrossRef] [Green Version]
  44. Yuan, H.; Yang, G.; Li, C.; Wang, Y.; Liu, J.; Yu, H.; Feng, H.; Xu, B.; Zhao, X.; Yang, X. Retrieving soybean leaf area index from unmanned aerial vehicle hyperspectral remote sensing: Analysis of RF, ANN, and SVM regression models. Remote Sens. 2017, 9, 309. [Google Scholar] [CrossRef] [Green Version]
  45. Fassnacht, F.; Hartig, F.; Latifi, H.; Berger, C.; Hernández, J.; Corvalán, P.; Koch, B. Importance of sample size, data type and prediction method for remote sensing-based estimations of aboveground forest biomass. Remote Sens. Environ. 2014, 154, 102–114. [Google Scholar] [CrossRef]
  46. Seyednasrollah, B.; Young, A.M.; Hufkens, K.; Milliman, T.; Friedl, M.A.; Frolking, S.; Richardson, A.D. Tracking vegetation phenology across diverse biomes using Version 2.0 of the PhenoCam Dataset. Sci. Data 2019, 6, 1–11. [Google Scholar]
  47. Zhang, X.; Jayavelu, S.; Liu, L.; Friedl, M.A.; Henebry, G.M.; Liu, Y.; Schaaf, C.B.; Richardson, A.D.; Gray, J. Evaluation of land surface phenology from VIIRS data using time series of PhenoCam imagery. Agric. For. Meteorol. 2018, 256, 137–149. [Google Scholar] [CrossRef]
  48. Klosterman, S.; Melaas, E.; Wang, J.A.; Martinez, A.; Frederick, S.; O’Keefe, J.; Orwig, D.A.; Wang, Z.; Sun, Q.; Schaaf, C. Fine-scale perspectives on landscape phenology from unmanned aerial vehicle (UAV) photography. Agric. For. Meteorol. 2018, 248, 397–407. [Google Scholar] [CrossRef]
  49. Xie, Y.; Civco, D.L.; Silander, J.A., Jr. Species-specific spring and autumn leaf phenology captured by time-lapse digital cameras. Ecosphere 2018, 9, e02089. [Google Scholar] [CrossRef] [Green Version]
  50. Klosterman, S.; Hufkens, K.; Gray, J.; Melaas, E.; Sonnentag, O.; Lavine, I.; Mitchell, L.; Norman, R.; Friedl, M.; Richardson, A. Evaluating remote sensing of deciduous forest phenology at multiple spatial scales using PhenoCam imagery. Biogeosciences 2014, 11, 4305–4320. [Google Scholar] [CrossRef] [Green Version]
  51. Richardson, A.D.; Jenkins, J.P.; Braswell, B.H.; Hollinger, D.Y.; Ollinger, S.V.; Smith, M.-L. Use of digital webcam images to track spring green-up in a deciduous broadleaf forest. Oecologia 2007, 152, 323–334. [Google Scholar] [CrossRef]
  52. Alberton, B.; Torres, R.D.S.; Cancian, L.F.; Borges, B.D.; Almeida, J.; Mariano, G.C.; dos Santos, J.; Morellato, L.P.C. Introducing digital cameras to monitor plant phenology in the tropics: Applications for conservation. Perspect. Ecol. Conserv. 2017, 15, 82–90. [Google Scholar] [CrossRef]
  53. Liu, Y.; Hill, M.J.; Zhang, X.; Wang, Z.; Richardson, A.D.; Hufkens, K.; Filippa, G.; Baldocchi, D.D.; Ma, S.; Verfaillie, J. Using data from Landsat, MODIS, VIIRS and PhenoCams to monitor the phenology of California oak/grass savanna and open grassland across spatial scales. Agric. For. Meteorol. 2017, 237, 311–325. [Google Scholar] [CrossRef]
  54. Franch, B.; Vermote, E.F.; Skakun, S.; Roger, J.-C.; Becker-Reshef, I.; Murphy, E.; Justice, C. Remote sensing based yield monitoring: Application to winter wheat in United States and Ukraine. Int. J. Appl. Earth Obs. Geoinf. 2019, 76, 112–127. [Google Scholar] [CrossRef]
  55. Palm, C. Color texture classification by integrative co-occurrence matrices. Pattern Recognit. 2004, 37, 965–976. [Google Scholar] [CrossRef]
  56. Bhatta, B. Analysis of urban growth pattern using remote sensing and GIS: A case study of Kolkata, India. Int. J. Remote Sens. 2009, 30, 4733–4746. [Google Scholar] [CrossRef]
  57. Dey, V.; Zhang, Y.; Zhong, M. A Review on Image Segmentation Techniques with Remote Sensing Perspective. In Proceedings of the ISPRS TC VII Symposium—100 Years ISPRS, Vienna, Austria, 5–7 July 2010; Volume XXXVIII, Part 7A. Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.466.8028&rep=rep1&type=pdf (accessed on 15 September 2021).
  58. Lu, D. Aboveground biomass estimation using Landsat TM data in the Brazilian Amazon. Int. J. Remote Sens. 2005, 26, 2509–2525. [Google Scholar] [CrossRef]
  59. Yang, K.; Gong, Y.; Fang, S.; Duan, B.; Yuan, N.; Peng, Y.; Wu, X.; Zhu, R. Combining Spectral and Texture Features of UAV Images for the Remote Estimation of Rice LAI throughout the Entire Growing Season. Remote Sens. 2021, 13, 3001. [Google Scholar] [CrossRef]
  60. Peña-Barragán, J.M.; Ngugi, M.K.; Plant, R.E.; Six, J. Object-based crop identification using multiple vegetation indices, textural features and crop phenology. Remote Sens. Environ. 2011, 115, 1301–1316. [Google Scholar] [CrossRef]
  61. Liu, Y.; Wang, E.; Yang, X.; Wang, J. Contributions of climatic and crop varietal changes to crop production in the North China Plain, since 1980s. Glob. Chang. Biol. 2010, 16, 2287–2299. [Google Scholar] [CrossRef]
  62. Wu, D.; Yu, Q.; Lu, C.; Hengsdijk, H. Quantifying production potentials of winter wheat in the North China Plain. Eur. J. Agron. 2006, 24, 226–235. [Google Scholar] [CrossRef]
  63. Guo, Y.; Yin, G.; Sun, H.; Wang, H.; Chen, S.; Senthilnath, J.; Wang, J.; Fu, Y. Scaling effects on chlorophyll content estimations with RGB camera mounted on a UAV platform using machine-learning methods. Sensors 2020, 20, 5130. [Google Scholar] [CrossRef]
  64. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  65. Aasen, H.; Kirchgessner, N.; Walter, A.; Liebisch, F. PhenoCams for field phenotyping: Using very high temporal resolution digital repeated photography to investigate interactions of growth, phenology, and harvest traits. Front. Plant Sci. 2020, 11, 593. [Google Scholar] [CrossRef]
  66. Sonnentag, O.; Hufkens, K.; Teshera-Sterne, C.; Young, A.M.; Friedl, M.; Braswell, B.H.; Milliman, T.; O’Keefe, J.; Richardson, A.D. Digital repeat photography for phenological research in forest ecosystems. Agric. For. Meteorol. 2012, 152, 159–177. [Google Scholar] [CrossRef]
  67. Luo, Y.; El-Madany, T.S.; Filippa, G.; Ma, X.; Ahrens, B.; Carrara, A.; Gonzalez-Cascon, R.; Cremonese, E.; Galvagno, M.; Hammer, T.W. Using near-infrared-enabled digital repeat photography to track structural and physiological phenology in Mediterranean tree–grass ecosystems. Remote Sens. 2018, 10, 1293. [Google Scholar] [CrossRef] [Green Version]
  68. Liu, Y.; Wu, C.; Sonnentag, O.; Desai, A.R.; Wang, J. Using the red chromatic coordinate to characterize the phenology of forest canopy photosynthesis. Agric. For. Meteorol. 2020, 285, 107910. [Google Scholar] [CrossRef]
  69. Saberioon, M.; Amin, M.; Anuar, A.; Gholizadeh, A.; Wayayok, A.; Khairunniza-Bejo, S. Assessment of rice leaf chlorophyll content using visible bands at different growth stages at both the leaf and canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2014, 32, 35–45. [Google Scholar] [CrossRef]
  70. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  71. Hall-Beyer, M. Practical guidelines for choosing GLCM textures to use in landscape classification tasks over a range of moderate spatial scales. Int. J. Remote Sens. 2017, 38, 1312–1338. [Google Scholar] [CrossRef]
  72. Gao, C.-C.; Hui, X.-W. GLCM-based texture feature extraction. Comput. Syst. Appl. 2010, 6, 048. [Google Scholar]
  73. Zhang, J.; Marszałek, M.; Lazebnik, S.; Schmid, C. Local features and kernels for classification of texture and object categories: A comprehensive study. Int. J. Comput. Vis. 2007, 73, 213–238. [Google Scholar] [CrossRef] [Green Version]
  74. Khatami, R.; Mountrakis, G.; Stehman, S.V. A meta-analysis of remote sensing research on supervised pixel-based land-cover image classification processes: General guidelines for practitioners and future research. Remote Sens. Environ. 2016, 177, 89–100. [Google Scholar] [CrossRef] [Green Version]
  75. Wang, F.; Yi, Q.; Hu, J.; Xie, L.; Yao, X.; Xu, T.; Zheng, J. Combining spectral and textural information in UAV hyperspectral images to estimate rice grain yield. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102397. [Google Scholar] [CrossRef]
  76. Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef] [Green Version]
  77. Fink, B.; Grammer, K.; Thornhill, R. Human (Homo sapiens) facial attractiveness in relation to skin texture and color. J. Comp. Psychol. 2001, 115, 92. [Google Scholar] [CrossRef] [Green Version]
  78. Sheha, M.A.; Mabrouk, M.S.; Sharawy, A. Automatic detection of melanoma skin cancer using texture analysis. Int. J. Comput. Appl. 2012, 42, 22–26. [Google Scholar]
  79. Zhou, X.; Geng, X.; Yin, G.; Hänninen, H.; Hao, F.; Zhang, X.; Fu, Y.H. Legacy effect of spring phenology on vegetation growth in temperate China. Agric. For. Meteorol. 2020, 281, 107845. [Google Scholar] [CrossRef]
  80. Cong, N.; Piao, S.; Chen, A.; Wang, X.; Lin, X.; Chen, S.; Han, S.; Zhou, G.; Zhang, X. Spring vegetation green-up date in China inferred from SPOT NDVI data: A multiple model analysis. Agric. For. Meteorol. 2012, 165, 104–113. [Google Scholar] [CrossRef]
  81. Atkinson, P.M.; Jeganathan, C.; Dash, J.; Atzberger, C. Inter-comparison of four models for smoothing satellite sensor time-series data to estimate vegetation phenology. Remote Sens. Environ. 2012, 123, 400–417. [Google Scholar] [CrossRef]
  82. Zeng, L.; Wardlow, B.D.; Xiang, D.; Hu, S.; Li, D. A review of vegetation phenological metrics extraction using time-series, multispectral satellite data. Remote Sens. Environ. 2020, 237, 111511. [Google Scholar] [CrossRef]
  83. Zhu, Y.; Zhang, Y.; Zu, J.; Wang, Z.; Huang, K.; Cong, N.; Tang, Z. Effects of data temporal resolution on phenology extractions from the alpine grasslands of the Tibetan Plateau. Ecol. Indic. 2019, 104, 365–377. [Google Scholar] [CrossRef]
  84. Li, N.; Zhan, P.; Pan, Y.; Zhu, X.; Li, M.; Zhang, D. Comparison of remote sensing time-series smoothing methods for grassland spring phenology extraction on the Qinghai–Tibetan Plateau. Remote Sens. 2020, 12, 3383. [Google Scholar] [CrossRef]
  85. Liu, Y.; Bachofen, C.; Wittwer, R.; Duarte, G.S.; Sun, Q.; Klaus, V.H.; Buchmann, N. Using PhenoCams to track crop phenology and explain the effects of different cropping systems on yield. Agric. Syst. 2022, 195, 103306. [Google Scholar] [CrossRef]
  86. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  87. Liu, Y.; Liu, S.; Li, J.; Guo, X.; Wang, S.; Lu, J. Estimating biomass of winter oilseed rape using vegetation indices and texture metrics derived from UAV multispectral images. Comput. Electron. Agric. 2019, 166, 105026. [Google Scholar] [CrossRef]
  88. Wang, D.; Fahad, S.; Saud, S.; Kamran, M.; Khan, A.; Khan, M.N.; Hammad, H.M.; Nasim, W. Morphological acclimation to agronomic manipulation in leaf dispersion and orientation to promote “Ideotype” breeding: Evidence from 3D visual modeling of “super” rice (Oryza sativa L.). Plant Physiol. Biochem. 2019, 135, 499–510. [Google Scholar] [CrossRef]
  89. Shaaf, S.; Bretani, G.; Biswas, A.; Fontana, I.M.; Rossini, L. Genetics of barley tiller and leaf development. J. Integr. Plant Biol. 2019, 61, 226–256. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  90. Wang, Y.; Wen, W.; Wu, S.; Wang, C.; Yu, Z.; Guo, X.; Zhao, C. Maize plant phenotyping: Comparing 3D laser scanning, multi-view stereo reconstruction, and 3D digitizing estimates. Remote Sens. 2019, 11, 63. [Google Scholar] [CrossRef] [Green Version]
  91. Liu, C.; Zhang, Q.; Tao, S.; Qi, J.; Ding, M.; Guan, Q.; Wu, B.; Zhang, M.; Nabil, M.; Tian, F. A new framework to map fine resolution cropping intensity across the globe: Algorithm, validation, and implication. Remote Sens. Environ. 2020, 251, 112095. [Google Scholar] [CrossRef]
  92. Benami, E.; Jin, Z.; Carter, M.R.; Ghosh, A.; Hijmans, R.J.; Hobbs, A.; Kenduiywo, B.; Lobell, D.B. Uniting remote sensing, crop modelling and economics for agricultural risk management. Nat. Rev. Earth Environ. 2021, 2, 140–159. [Google Scholar] [CrossRef]
  93. Tuchscherer, M.; Otten, W.; Kanitz, E.; Gräbner, M.; Tuchscherer, A.; Bellmann, O.; Rehfeldt, C.; Metges, C.C. Effects of inadequate maternal dietary protein: Carbohydrate ratios during pregnancy on offspring immunity in pigs. BMC Vet. Res. 2012, 8, 1–11. [Google Scholar] [CrossRef] [Green Version]
  94. Guo, Y.; Senthilnath, J.; Wu, W.; Zhang, X.; Zeng, Z.; Huang, H. Radiometric calibration for multispectral camera of different imaging conditions mounted on a UAV platform. Sustainability 2019, 11, 978. [Google Scholar] [CrossRef] [Green Version]
  95. Hassan, M.A.; Yang, M.; Rasheed, A.; Yang, G.; Reynolds, M.; Xia, X.; Xiao, Y.; He, Z. A rapid monitoring of NDVI across the wheat growth cycle for grain yield prediction using a multi-spectral UAV platform. Plant Sci. 2019, 282, 95–103. [Google Scholar] [CrossRef]
  96. Johansen, K.; Raharjo, T.; McCabe, M.F. Using multi-spectral UAV imagery to extract tree crop structural properties and assess pruning effects. Remote Sens. 2018, 10, 854. [Google Scholar] [CrossRef] [Green Version]
  97. Zaman-Allah, M.; Vergara, O.; Araus, J.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.; Hornero, A.; Albà, A.H.; Das, B.; Craufurd, P. Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize. Plant Methods 2015, 11, 1–10. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  98. Mohammed, G.H.; Colombo, R.; Middleton, E.M.; Rascher, U.; van der Tol, C.; Nedbal, L.; Goulas, Y.; Pérez-Priego, O.; Damm, A.; Meroni, M. Remote sensing of solar-induced chlorophyll fluorescence (SIF) in vegetation: 50 years of progress. Remote Sens. Environ. 2019, 231, 111177. [Google Scholar] [CrossRef] [PubMed]
  99. Li, Z.; Zhang, Q.; Li, J.; Yang, X.; Wu, Y.; Zhang, Z.; Wang, S.; Wang, H.; Zhang, Y. Solar-induced chlorophyll fluorescence and its link to canopy photosynthesis in maize from continuous ground measurements. Remote Sens. Environ. 2020, 236, 111420. [Google Scholar] [CrossRef]
  100. Liu, L.; Yang, X.; Zhou, H.; Liu, S.; Zhou, L.; Li, X.; Yang, J.; Han, X.; Wu, J. Evaluating the utility of solar-induced chlorophyll fluorescence for drought monitoring by comparison with NDVI derived from wheat canopy. Sci. Total Environ. 2018, 625, 1208–1217. [Google Scholar] [CrossRef] [PubMed]
  101. Carlson, T.N.; Ripley, D.A. On the relation between NDVI, fractional vegetation cover, and leaf area index. Remote Sens. Environ. 1997, 62, 241–252. [Google Scholar] [CrossRef]
  102. Tucker, C.J.; Pinzon, J.E.; Brown, M.E.; Slayback, D.A.; Pak, E.W.; Mahoney, R.; Vermote, E.F.; El Saleous, N. An extended AVHRR 8-km NDVI dataset compatible with MODIS and SPOT vegetation NDVI data. Int. J. Remote Sens. 2005, 26, 4485–4498. [Google Scholar] [CrossRef]
  103. Matsushita, B.; Yang, W.; Chen, J.; Onda, Y.; Qiu, G. Sensitivity of the enhanced vegetation index (EVI) and normalized difference vegetation index (NDVI) to topographic effects: A case study in high-density cypress forest. Sensors 2007, 7, 2636–2651. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  104. Velazco, J.G.; Rodríguez-Álvarez, M.X.; Boer, M.P.; Jordan, D.R.; Eilers, P.H.; Malosetti, M.; Van Eeuwijk, F.A. Modelling spatial trends in sorghum breeding field trials using a two-dimensional P-spline mixed model. Theor. Appl. Genet. 2017, 130, 1375–1392. [Google Scholar] [CrossRef] [Green Version]
  105. Cendrero-Mateo, M.P.; Muller, O.; Albrecht, H.; Burkart, A.; Gatzke, S.; Janssen, B.; Keller, B.; Körber, N.; Kraska, T.; Matsubara, S. Field phenotyping: Concepts and examples to quantify dynamic plant traits across scales in the field. In Terrestrial Ecosystem Research Infrastructures; CRC Press: Boca Raton, FL, USA, 2017; pp. 53–81. [Google Scholar]
  106. Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
  107. Tattaris, M.; Reynolds, M.P.; Chapman, S.C. A direct comparison of remote sensing approaches for high-throughput phenotyping in plant breeding. Front. Plant Sci. 2016, 7, 1131. [Google Scholar] [CrossRef]
  108. Hu, P.; Chapman, S.C.; Wang, X.; Potgieter, A.; Duan, T.; Jordan, D.; Guo, Y.; Zheng, B. Estimation of plant height using a high throughput phenotyping platform based on unmanned aerial vehicle and self-calibration: Example for sorghum breeding. Eur. J. Agron. 2018, 95, 24–32. [Google Scholar] [CrossRef]
  109. Grüner, E.; Wachendorf, M.; Astor, T. The potential of UAV-borne spectral and textural information for predicting aboveground biomass and N fixation in legume-grass mixtures. PLoS ONE 2020, 15, e0234703. [Google Scholar] [CrossRef]
  110. Park, J.Y.; Muller-Landau, H.C.; Lichstein, J.W.; Rifai, S.W.; Dandois, J.P.; Bohlman, S.A. Quantifying leaf phenology of individual trees and species in a tropical forest using unmanned aerial vehicle (UAV) images. Remote Sens. 2019, 11, 1534. [Google Scholar] [CrossRef] [Green Version]
  111. Berra, E.F.; Gaulton, R.; Barr, S. Assessing spring phenology of a temperate woodland: A multiscale comparison of ground, unmanned aerial vehicle and Landsat satellite observations. Remote Sens. Environ. 2019, 223, 229–242. [Google Scholar] [CrossRef]
  112. Yang, Q.; Shi, L.; Han, J.; Yu, J.; Huang, K. A near real-time deep learning approach for detecting rice phenology based on UAV images. Agric. For. Meteorol. 2020, 287, 107938. [Google Scholar] [CrossRef]
  113. Thapa, S.; Garcia Millan, V.E.; Eklundh, L. Assessing Forest Phenology: A Multi-Scale Comparison of Near-Surface (UAV, Spectral Reflectance Sensor, PhenoCam) and Satellite (MODIS, Sentinel-2) Remote Sensing. Remote Sens. 2021, 13, 1597. [Google Scholar] [CrossRef]
  114. Li, X.; Fu, Y.H.; Chen, S.; Xiao, J.; Yin, G.; Li, X.; Zhang, X.; Geng, X.; Wu, Z.; Zhou, X. Increasing importance of precipitation in spring phenology with decreasing latitudes in subtropical forest area in China. Agric. For. Meteorol. 2021, 304, 108427. [Google Scholar] [CrossRef]
  115. Czernecki, B.; Nowosad, J.; Jabłońska, K. Machine learning modeling of plant phenology based on coupling satellite and gridded meteorological dataset. Int. J. Biometeorol. 2018, 62, 1297–1309. [Google Scholar] [CrossRef] [Green Version]
  116. Almeida, J.; dos Santos, J.A.; Alberton, B.; Torres, R.d.S.; Morellato, L.P.C. Applying machine learning based on multiscale classifiers to detect remote phenology patterns in cerrado savanna trees. Ecol. Inform. 2014, 23, 49–61. [Google Scholar] [CrossRef]
  117. Xin, Q.; Li, J.; Li, Z.; Li, Y.; Zhou, X. Evaluations and comparisons of rule-based and machine-learning-based methods to retrieve satellite-based vegetation phenology using MODIS and USA National Phenology Network data. Int. J. Appl. Earth Obs. Geoinf. 2020, 93, 102189. [Google Scholar] [CrossRef]
  118. Holloway, P.; Kudenko, D.; Bell, J.R. Dynamic selection of environmental variables to improve the prediction of aphid phenology: A machine learning approach. Ecol. Indic. 2018, 88, 512–521. [Google Scholar] [CrossRef] [Green Version]
  119. Iandola, F.N.; Han, S.; Moskewicz, M.W.; Ashraf, K.; Dally, W.J.; Keutzer, K. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size. arXiv 2016, arXiv:1602.07360. [Google Scholar]
  120. Zhong, Z.; Jin, L.; Xie, Z. High performance offline handwritten Chinese character recognition using GoogLeNet and directional feature maps. In Proceedings of the 2015 13th International Conference on Document Analysis and Recognition (ICDAR), Tunis, Tunisia, 23–26 August 2015; pp. 846–850. [Google Scholar]
  121. Hoheneder, T.J. Evaluation of a Low-Cost UAS and Phenocams for Measuring Grapevine Greenness. Ph.D. Thesis, West Virginia University, Morgantown, WV, USA, 2021. [Google Scholar]
  122. Zeng, L.; Peng, G.; Meng, R.; Man, J.; Li, W.; Xu, B.; Lv, Z.; Sun, R. Wheat Yield Prediction Based on Unmanned Aerial Vehicles-Collected Red–Green–Blue Imagery. Remote Sens. 2021, 13, 2937. [Google Scholar] [CrossRef]
  123. Burke, M.W.; Rundquist, B.C. Scaling Phenocam GCC, NDVI, and EVI2 with Harmonized Landsat-Sentinel using Gaussian Processes. Agric. For. Meteorol. 2021, 300, 108316. [Google Scholar] [CrossRef]
Figure 1. The geographic locations of the experimental sites and the region of interest (ROI) selection for data analysis. (a) Geographic locations of the experimental sites; the light-blue shaded area represents the North China Plain (NCP). (b) View from the phenocam at Nanpi in 2020. (c,d) Views from the phenocam at Shangqiu in 2018 and 2019, and in 2020, respectively. Note: the red, blue and yellow boxes in (b–d) indicate the ROIs used for calculating the indices.
Figure 2. The definition of maize phenology according to the temporal dynamics of the vegetation index. Note: the orange and blue solid lines in (a) are the original GCC and the GCC filtered by DLF for the Shangqiu site in 2018. The green, purple, red and yellow dashed lines in (a) correspond to (b–e), marking the different maize phenological stages, respectively.
Figure 3. The daily average values, average values between the 25th and 75th percentiles, and averages of the 90th percentile of the spectral and textural indices. Note: the temporal changes shown are based on phenocam data obtained at Shangqiu in 2018. (a–h) show the indices calculated using GCC, RCC, RGRI, RGBVI, contrast, correlation, energy and homogeneity, respectively.
Figure 4. Comparison of the original indices and the indices filtered using DLF, HANTS, SG and SI. Note: the temporal changes in the spectral and textural indices shown are based on phenocam data obtained at Shangqiu in 2018. (a) shows the original indices; (b–e) show the indices filtered by DLF, HANTS, SG and SI, respectively.
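Among the four filtering methods compared in Figure 4, the double logistic function (DLF) yielded the most accurate phenology retrievals. The sketch below shows how a DLF can be fitted to a daily index series such as GCC; it assumes a common six-parameter form of the double logistic (one rising and one falling sigmoid), and the function names `double_logistic` and `fit_dlf` are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_logistic(t, vmin, vmax, k1, t1, k2, t2):
    """Double logistic curve: a rising sigmoid around t1 (green-up)
    and a falling sigmoid around t2 (senescence), between vmin and vmax."""
    return vmin + (vmax - vmin) * (
        1.0 / (1.0 + np.exp(-k1 * (t - t1)))
        + 1.0 / (1.0 + np.exp(k2 * (t - t2)))
        - 1.0
    )

def fit_dlf(doy, vi):
    """Fit the double logistic to a daily vegetation-index series.

    doy: day-of-year values; vi: index values (e.g., daily mean GCC).
    Returns the six fitted parameters.
    """
    doy = np.asarray(doy, dtype=float)
    vi = np.asarray(vi, dtype=float)
    # Rough initial guesses: min/max of the series, moderate slopes,
    # green-up at 25% and senescence at 75% of the observation window.
    p0 = [vi.min(), vi.max(), 0.2,
          doy[0] + 0.25 * (doy[-1] - doy[0]),
          0.2,
          doy[0] + 0.75 * (doy[-1] - doy[0])]
    popt, _ = curve_fit(double_logistic, doy, vi, p0=p0, maxfev=20000)
    return popt
```

In practice, the phenological dates are then read off the fitted curve (e.g., from its inflection points or threshold crossings), which is why a smooth, parametric filter such as the DLF lends itself to this extraction step.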
Figure 5. Comparison of the extracted phenology with ground observations for the different filtering methods. Note: (a–d) show the comparisons for DLF, HANTS, SG and SI, respectively.
Figure 6. Comparison of extracted and ground-observed phenology using different filtering methods. Note: sq and np denote the results for the Shangqiu and Nanpi sites, respectively.
Figure 7. Comparison of extracted and ground-observed maize phenology using different indices. (a,c) show the R2 and RMSE using different indices for all sites and years; (b,d) show the R2 and RMSE using different indices for each site and year, respectively.
Table 1. Observed maize phenology in day-of-year (DOY) format at the sites in Nanpi and Shangqiu, respectively.
Sites | Year | Seeding | Six Leaves | Booting | Heading | Maturity
Shangqiu | 2018 | 164 | 177 | 188 | 207 | 255
Shangqiu | 2019 | 167 | 180 | 191 | 210 | 258
Shangqiu | 2020 | 169 | 182 | 193 | 212 | 260
Nanpi | 2020 | 175 | 188 | 199 | 218 | 266
Table 2. Definition of spectral indices used in this study.
Spectral Indices | Formulation | Reference
Green chromatic coordinate (GCC) | GCC = G/(R + G + B) | [66,67]
Red chromatic coordinate (RCC) | RCC = R/(R + G + B) | [68]
Red–green ratio index (RGRI) | RGRI = R/G | [69]
Red–green–blue vegetation index (RGBVI) | RGBVI = (G × G − B × R)/(G × G + B × R) | [70]
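The spectral indices in Table 2 can be computed directly from the ROI-averaged digital numbers of the red, green and blue channels. A minimal sketch follows; the function name `spectral_indices` is chosen for illustration and is not from the paper.

```python
import numpy as np

def spectral_indices(rgb):
    """Compute GCC, RCC, RGRI and RGBVI per Table 2.

    rgb: array-like of shape (..., 3) holding the ROI-mean R, G, B
    digital numbers (e.g., one triplet per day).
    """
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b
    return {
        "GCC": g / total,                              # G/(R + G + B)
        "RCC": r / total,                              # R/(R + G + B)
        "RGRI": r / g,                                 # R/G
        "RGBVI": (g * g - b * r) / (g * g + b * r),    # (G² − BR)/(G² + BR)
    }
```

Because each index is a ratio of channel values, it is relatively robust to day-to-day changes in overall illumination, which is why chromatic coordinates such as GCC are the standard greenness measures for phenocam time series.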
Table 3. Definition of textural indices used in this study.
Textural Indices | Formula | Value Range
Contrast | Contrast = Σᵢ,ⱼ (i − j)² p(i, j) | Ranging from 0 to the square of (the number of gray levels − 1)
Correlation | Correlation = Σᵢ,ⱼ (i − μᵢ)(j − μⱼ) p(i, j)/(σᵢ σⱼ) | Ranging from −1 to 1
Energy | Energy = Σᵢ,ⱼ p(i, j)² | Ranging from 0 to 1
Homogeneity | Homogeneity = Σᵢ,ⱼ p(i, j)/(1 + |i − j|) | Ranging from 0 to 1
Note: i and j are the gray-level indices (row and column) of the co-occurrence matrix; p(i, j) is the normalized frequency with which gray levels i and j co-occur in the GLCM; μᵢ, μⱼ, σᵢ and σⱼ are the corresponding GLCM means and standard deviations.
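The textural indices in Table 3 are statistics of the normalized gray-level co-occurrence matrix (GLCM). The sketch below builds a GLCM for a single horizontal pixel offset on a gray-level-quantized image and then evaluates the four statistics; the offset, quantization and the names `glcm` and `texture_indices` are illustrative assumptions, not details given in the paper.

```python
import numpy as np

def glcm(gray, levels=8, dx=1, dy=0):
    """Normalized GLCM for one pixel offset (dx, dy).

    gray: 2-D integer array with values in [0, levels).
    Returns p, where p[i, j] is the normalized frequency with which
    gray level i is followed by gray level j at the given offset.
    """
    g = np.asarray(gray)
    h, w = g.shape
    p = np.zeros((levels, levels), dtype=float)
    for y in range(h - dy):
        for x in range(w - dx):
            p[g[y, x], g[y + dy, x + dx]] += 1
    return p / p.sum()

def texture_indices(p):
    """Contrast, correlation, energy and homogeneity per Table 3."""
    levels = p.shape[0]
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mu_i = (i * p).sum()
    mu_j = (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "contrast": ((i - j) ** 2 * p).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j),
        "energy": (p ** 2).sum(),
        "homogeneity": (p / (1.0 + np.abs(i - j))).sum(),
    }
```

In practice, several offsets (e.g., 0°, 45°, 90°, 135°) are usually computed and averaged so the texture measures are approximately rotation invariant.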
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Guo, Y.; Chen, S.; Fu, Y.H.; Xiao, Y.; Wu, W.; Wang, H.; Beurs, K.d. Comparison of Multi-Methods for Identifying Maize Phenology Using PhenoCams. Remote Sens. 2022, 14, 244. https://doi.org/10.3390/rs14020244