Article

Effect of the Red-Edge Band from Drone Altum Multispectral Camera in Mapping the Canopy Cover of Winter Wheat, Chickweed, and Hairy Buttercup

Department of Agricultural and Environmental Sciences, College of Agriculture, Tennessee State University, Nashville, TN 37209, USA
* Author to whom correspondence should be addressed.
Drones 2023, 7(4), 277; https://doi.org/10.3390/drones7040277
Submission received: 13 March 2023 / Revised: 11 April 2023 / Accepted: 17 April 2023 / Published: 19 April 2023

Abstract

The detection and mapping of winter wheat and the canopy cover of associated weeds, such as chickweed and hairy buttercup, are essential for crop and weed management. With emerging drone technologies, multispectral cameras with a red-edge band, such as the Altum, are commonly used for crop and weed mapping. However, little is understood about the contribution of the red-edge band to this mapping. The aim of this study was to examine whether adding the red-edge band from a drone-mounted Altum multispectral camera improves the detection and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup. The canopy cover of winter wheat, chickweed, and hairy buttercup was classified and mapped with the red-edge band included and excluded using a random forest classification algorithm. Results showed that the addition of the red-edge band increased the overall mapping accuracy by about 7%. Furthermore, the red-edge wavelength was found to better detect winter wheat relative to chickweed and hairy buttercup. This study demonstrated the usefulness of the red-edge band in improving the detection and mapping of winter wheat and associated weeds (chickweed and hairy buttercup) in agricultural fields.

1. Introduction

Mapping the canopy cover of row crops such as winter wheat and associated weeds (chickweed and hairy buttercup) is generally important for crop and weed management. Crop canopy cover maps can be used to monitor performance traits such as growth, greenness, and productivity [1,2]. Furthermore, accurate mapping of the canopy cover of crops such as winter wheat can determine the planting area, landscape pattern, and planting structure for crop yield estimation [2,3]. For example, canopy cover maps can be used to estimate the spatial extent of crops within agricultural fields, which could support yield estimation. Additionally, the crop landscape distribution pattern can be easily constructed from canopy cover maps based on spatial proximity analysis. Moreover, plant structures such as stems, branches, leaves, and flowers can also be easily detected and monitored from crop canopy cover maps. Therefore, accurate mapping of the canopy cover of crops such as winter wheat is beneficial for enhancing monitoring, yield estimation, and management.
In contrast, weed canopy cover maps in agricultural winter wheat production are essential to support the site-specific application of herbicides [4,5]. Weed control is necessary to improve grain yields in winter wheat production. This is because weed density, canopy cover and biomass negatively affect grain yield through competition for resources such as nutrients [6]. For example, crop yield losses have been found to increase with rising weed density, canopy cover, and biomass [7,8]. The accurate mapping of the canopy cover of weeds is therefore important to support site-specific weed control.
The detection and mapping of the canopy cover of winter wheat and associated weeds such as chickweed and hairy buttercup in agricultural field plots generally requires high-spatial-resolution remotely sensed data, such as that from drones. With emerging drone technologies, the use of multispectral cameras with a red-edge band, such as the Altum, is becoming common for crop and weed mapping applications [9,10]. The red-edge band is spectrally located between the red (630–690 nm) and near-infrared (NIR) (770–890 nm) parts of the electromagnetic spectrum, with a band range of roughly 670–780 nm [11].
The chlorophyll content of winter wheat and associated weed canopy cover has been found to influence the reflectance from the red-edge band [12,13,14]. This is because of the sensitivity of the red-edge band to vegetation chlorophyll [13]. Furthermore, there is also variability in the spectral signatures for winter wheat, chickweed, and hairy buttercup, with a change in the reflectance values of the red-edge band (Figure 1). The change in the reflectance values could help in the classification of winter wheat, chickweed, and hairy buttercup. Therefore, there is a need to explore the contribution of the red-edge wavelength in the detection and mapping of the canopy cover of winter wheat and associated weeds. The inclusion of the red-edge wavelength in the classification could improve the detection and mapping accuracy of the canopy cover of winter wheat, chickweed, and hairy buttercup.
Several recent studies have explored the potential of the red-edge band for improving the mapping of agricultural land cover and land use types to enhance crop and weed management [15,16,17,18]. For example, Sun et al. [15] explored the integration of the red-edge wavelength for the detection and mapping of crop types that included, but were not limited to, corn and cotton. They utilized the random forest classifier for integrating the red-edge band with other optical bands of Sentinel-2 for mapping agricultural crops. They found that the inclusion of the red-edge wavelength improved the overall accuracy of crop type mapping by 1.84% compared to only using traditional optical bands, i.e., blue (450–520 nm), green (520–590 nm), red (630–690 nm), and near-infrared (770–890 nm) [19,20]. Likewise, Luo et al. [18] also found that the addition of the red-edge band from Sentinel-2 improved the overall accuracy of crop classification. Similarly, Forkuor et al. [17] explored the addition of the red-edge band from Sentinel-2 satellite data for mapping agricultural crops such as cereals and legumes. They utilized a combination of classification algorithms, namely random forest, support vector machine, and stochastic gradient boosting, for mapping crops with the red-edge band included and excluded. They found that the addition of the red-edge band improved the overall classification accuracy for agricultural crops by about 1.8%, 3.8%, and 4.2% using random forest, support vector machine, and stochastic gradient boosting, respectively.
Although a few studies have examined the red-edge band from satellite platforms such as Sentinel-2 for mapping agricultural land cover, very few studies have explored the red-edge wavelength from drones with multispectral cameras for mapping the canopy cover of crops and weeds. This study aims to examine whether the addition of the red-edge band from a drone with an Altum multispectral camera improves the detection and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup. It detects, maps, and examines the overall accuracy of the winter wheat and associated weed canopy cover generated from the drone imagery with the red-edge band included and excluded.

2. Materials and Methods

2.1. Study Area

This study was carried out in a 4047 m2 plot located at Tennessee State University’s (TSU) campus farmland in Davidson County, Tennessee. The area ranges from latitude 36°10′33″ to 36°10′35″ N and longitude 86°49′35″ to 86°49′38″ W (Figure 2).

2.1.1. Climate

A moderate climate with cool winters and warm summers is commonly experienced in the study area. The average temperature is approximately 78 °F (26 °C) in the summer and around 41 °F (5 °C) in the winter. The annual precipitation is around 51 inches (1300 mm) and is usually distributed fairly evenly throughout the seasons [21]. May usually experiences the maximum monthly mean precipitation of approximately 5.51 inches, whereas October experiences the minimum monthly mean rainfall of about 3.03 inches [22].

2.1.2. Soil and Topography

The soil of the study area is predominantly Byler silt loam and is moderately acidic. It consists of very deep, well-drained soils. The Byler silt loam has a moderate medium subangular blocky structure and is friable, with common fine roots and a few distinct brown clay films on the ped faces. The soil formed in silty alluvium and underlying clayey residuum of limestone. The depth to bedrock is greater than 60 inches, and the thickness of the silty mantle over the clayey limestone residuum ranges from about 3 to 6 feet [23]. The area consists of very gentle terrain with a 2 to 5 percent slope [21].

2.2. Methodology

The methodological approach mainly involved the mapping of the canopy cover of winter wheat and associated weeds (chickweed and hairy buttercup) using a drone with an Altum multispectral camera with a red-edge band, inclusively and exclusively (Figure 3). Accuracy assessments were conducted on the classified maps of winter wheat, chickweed and hairy buttercup using ground truthing field datasets (Figure 3).

2.2.1. Growing of Winter Wheat

The plots used for growing winter wheat were identified and first prepared for the planting of soybean. Soybean was planted in the summer of 2021 and was followed by winter wheat in the fall of 2021. Soil samples were collected from the plots for the analysis of soil moisture (gravimetric method), soil particle size (soil texture), and soil pH, and for nutrient recommendations for the winter wheat. The Roundup Ready soybean variety Croplan® by Winfield United was planted on 21 July 2021. The initial weeds were controlled by spraying a non-selective herbicide, a 2 percent Roundup® solution, in August of 2021. The soybean was harvested on 22 October 2021. On 28 October 2021, a burn-down application of Roundup® herbicide was used to kill existing weeds prior to the planting of the winter wheat. On 9 November 2021, the winter wheat variety AgriPro SY Viper was planted using a no-till planter/drill in selected no-till plots. The plot size was 6 m by 20 m. The winter wheat seeds were planted in a south-to-north direction. The wheat variety was selected based on performance trials and recommendations from agronomists at the University of Tennessee, Knoxville, Tennessee. During the tillering stage of the winter wheat, about 22 kg of nitrogen was applied to the winter wheat plots. Weed establishment was monitored during the tillering, jointing, boot, and heading/flowering stages. The heading/flowering stage was selected to examine the effect of the red-edge band on the detection and mapping of weed canopy cover, because weed cover at this stage was relatively high. The winter wheat was terminated on 1 June 2022 by mowing it down with a rotary mower; it was therefore not harvested for grain yield but for wheat straw.

2.2.2. Drone Data

An Inspire 2 drone with an Altum multispectral camera onboard was flown over the winter wheat field plot on 10 May 2022. The drone images were captured during the heading/flowering growth stage of the winter wheat, when the dominant weeds were chickweed and hairy buttercup. The drone was flown at a 15 m altitude and a speed of 3 m/s using the Pix4Dcapture auto-pilot mode. The Altum multispectral camera system captures images in six spectral bands (Table 1): a blue band from 450 to 520 nm, a green band from 520 to 590 nm, a red band from 630 to 690 nm, a red-edge band from 690 to 730 nm, a near-infrared (NIR) band from 770 to 890 nm, and a longwave infrared thermal (LWIR) band from 10,600 to 11,200 nm [19,24,25].
The Altum multispectral camera was triggered to take an image every 2 s, and more than five thousand images with a spatial resolution of about 1 cm were acquired. The camera was connected to a Global Positioning System (GPS) receiver and a Downwelling Light Sensor (DLS). The GPS data allowed the Altum multispectral camera system to properly geotag images. The DLS is a 5-band incident light sensor that measures the ambient light during a flight for each of the camera's five spectral bands, i.e., blue (450–520 nm), green (520–590 nm), red (630–690 nm), red-edge (690–730 nm), and near-infrared (770–890 nm). This information is recorded in the metadata of the TIFF images captured by the camera and can be used to correct for global lighting changes during flights, such as clouds covering the sun. The captured drone images were pre-processed by mosaicking, radiometric and geometric correction, and subsetting. The drone images were geotagged and automatically mosaicked in Pix4Dmapper, version 4.5, Pix4D Inc. (Denver, CO, USA). The radiometric correction was performed for camera and sun irradiance using calibrated reflectance panel images, which were acquired before and after each flight, to generate reflectance drone images. Furthermore, although the mosaicked image was geotagged with geographic coordinates, a geometric correction was still performed on it. The geometric correction was conducted using ground control points to properly register the coordinates on the image to features on the ground, such as plot boundaries. This was carried out using more than 50 ground control points derived from the field data of plot boundaries and ranging poles, with a root mean square (RMS) error of less than 1 pixel. The mosaicked drone imagery was subset to the 4047 m2 winter wheat field plot study area.
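The radiometric correction step described above converts raw digital numbers (DNs) to reflectance using the calibrated reflectance panel images. A minimal single-band sketch of this idea is shown below; the panel reflectance value, the DN values, and the simple linear gain model are illustrative assumptions, not the exact Pix4D workflow used in the study.

```python
import numpy as np

def dn_to_reflectance(band_dn, panel_dn_mean, panel_reflectance):
    """Convert raw digital numbers to reflectance using a calibrated
    reflectance panel (single-band empirical-line sketch).

    band_dn           : 2-D array of raw digital numbers for one band
    panel_dn_mean     : mean DN measured over the panel in this band
    panel_reflectance : known panel reflectance for this band (0-1)
    """
    gain = panel_reflectance / panel_dn_mean      # DN -> reflectance factor
    return np.clip(band_dn * gain, 0.0, 1.0)      # keep physically valid range

# Hypothetical red-edge band DNs and a 50% reflectance calibration panel
red_edge_dn = np.array([[12000.0, 24000.0],
                        [30000.0,  6000.0]])
refl = dn_to_reflectance(red_edge_dn, panel_dn_mean=30000.0,
                         panel_reflectance=0.5)
```

In practice, panel images taken before and after each flight (together with the DLS irradiance metadata) would be combined to compensate for changing illumination, rather than a single fixed gain.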
Supervised classification was performed on the drone spectral bands to classify winter wheat, chickweed, and hairy buttercup with the red-edge band included and excluded. Three hundred and sixty (360) polygons, consisting of 120 winter wheat, 120 chickweed, and 120 hairy buttercup polygons, were digitized around ranging poles that were inserted into the ground at the location of each class. In addition to the ranging poles, weed flowers were clearly visible in the mosaicked image and helped with the digitizing effort. Out of the 360 polygons, 60 polygons were used as training data, representing 20 for winter wheat, 20 for chickweed, and 20 for hairy buttercup in the supervised classification. The remaining 300 polygons were kept aside and overlaid on the classification maps for validation and accuracy assessments.
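The inclusive/exclusive band setup can be sketched with scikit-learn's random forest: the same training pixels are classified once with all five reflective bands and once with the red-edge column dropped. The pixel spectra, class labels, and band ordering below are hypothetical placeholders, not data from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical pixel samples extracted from the training polygons:
# each row is one pixel, columns are [blue, green, red, red_edge, nir].
rng = np.random.default_rng(0)
X_train = rng.random((300, 5))
y_train = rng.integers(0, 3, 300)   # 0 = wheat, 1 = chickweed, 2 = buttercup

RED_EDGE = 3  # column index of the red-edge band in this sketch

def fit_rf(X, y, include_red_edge=True):
    """Fit a random forest with the red-edge band included or excluded."""
    cols = [i for i in range(X.shape[1]) if include_red_edge or i != RED_EDGE]
    clf = RandomForestClassifier(n_estimators=10, random_state=0)
    clf.fit(X[:, cols], y)
    return clf, cols

rf_with, cols_with = fit_rf(X_train, y_train, include_red_edge=True)
rf_without, cols_without = fit_rf(X_train, y_train, include_red_edge=False)
```

Each fitted model would then be applied to the full image's pixels (restricted to the same band columns) to produce the two canopy cover maps that are compared in the accuracy assessment.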
The supervised classification was performed using a machine learning random forest classification algorithm [26]. This algorithm was selected because it has been found to outperform other machine learning classification algorithms, such as support vector machine for crop and weed detection and mapping [27,28,29]. It is an ensemble classification algorithm that produces multiple decision trees using a randomly selected subset of training samples and variables [30]. It is expressed using Equation (1) [26].
{DT(x, θk)},  k = 1, …, T      (1)
where x is the input vector and θk represents a random vector, which is sampled independently of, but with the same distribution as, the previous θ1, …, θk−1. T bootstrap samples are initially derived from the training data. An unpruned classification and regression tree (CART) is grown from each bootstrap sample β, where only one of M randomly selected features is chosen for the split at each node of the CART [31]. Each decision tree casts a unit vote for the most popular class to classify an input vector [32]. The number of features used at each node to generate a tree and the number of trees to be grown are the two user-defined parameters required to generate a random forest classifier.
The number of trees and training samples in the random forest classification prediction model were selected through a resampling-based procedure to search for optimal tuning parameters. The optimal settings were selected based on the mean overall accuracy across 5-fold cross validation, repeated twice [33,34]. The default number of training samples was set at 5000 and the number of trees was set at 10.
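The tuning procedure above (optimal settings chosen by the mean overall accuracy across 5-fold cross-validation, repeated twice) can be sketched with scikit-learn's grid search; the synthetic data and the candidate tree counts below are illustrative assumptions, not the study's actual search space.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold

rng = np.random.default_rng(1)
X = rng.random((150, 5))        # hypothetical pixel spectra (5 bands)
y = rng.integers(0, 3, 150)     # three classes: wheat / chickweed / buttercup

# 5-fold cross-validation, repeated twice, as in the tuning procedure above
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=2, random_state=0)
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50, 100]},  # candidate tree counts
    scoring="accuracy",          # mean overall accuracy across the folds
    cv=cv,
)
search.fit(X, y)
best_n_trees = search.best_params_["n_estimators"]
```

The chosen setting (`best_n_trees`) maximizes the mean accuracy over the ten train/validation splits produced by the repeated folds.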
The generated canopy cover maps of winter wheat, chickweed, and hairy buttercup were validated to examine how well the classified maps represented winter wheat, chickweed, and hairy buttercup on the ground. The validation effort was performed by overlaying the three hundred (300) digitized polygons that represented ground truth data kept aside for validation to each classified map. The 300 polygons used in validation were different from the 60 polygons used as training data. The overall classification accuracy was computed for each classified map by dividing the total correct (i.e., the sum of the major diagonal in the error matrix table) by the total number of pixels in the error matrix table [35]. The kappa coefficient was also measured as described by [35].
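The two summary measures described above can be computed directly from an error (confusion) matrix: overall accuracy is the sum of the major diagonal divided by the total pixel count, and the kappa coefficient corrects that agreement for chance [35]. The 3 × 3 matrix below is a hypothetical example, not the study's results.

```python
import numpy as np

def overall_accuracy(error_matrix):
    """Overall accuracy: sum of the major diagonal / total pixels."""
    m = np.asarray(error_matrix, dtype=float)
    return np.trace(m) / m.sum()

def kappa(error_matrix):
    """Kappa coefficient from a square error matrix."""
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()
    po = np.trace(m) / n                                 # observed agreement
    pe = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2    # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical error matrix (rows = classified, columns = reference),
# classes: wheat, chickweed, hairy buttercup
em = [[50, 5, 5],
      [5, 40, 5],
      [5, 5, 30]]
oa = overall_accuracy(em)   # 120 correct out of 150 pixels
k = kappa(em)
```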

3. Results

The canopy cover of winter wheat and associated weeds (chickweed and hairy buttercup) was successfully mapped with the red-edge band included and excluded (Figure 4 and Figure 5). The distributions of winter wheat, chickweed, and hairy buttercup canopy cover were consistent throughout the agricultural field in both maps. For example, the canopy cover of hairy buttercup was predominantly found in the central and northern parts of the agricultural field plot in both maps, whereas the canopy cover of chickweed was largely found in the northwestern parts of the field (Figure 4 and Figure 5).
In the classification and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup without the red-edge band, the overall mapping accuracy was about 71%, with a kappa coefficient of around 0.58 (Table 2). Furthermore, the user's accuracy, which measures how well the classified canopy cover of winter wheat, chickweed, and hairy buttercup on the map represented each class on the ground, was highest (88%) for chickweed and lowest (58%) for winter wheat. In addition, the producer's accuracy, which measures the ability of the random forest classification algorithm to detect winter wheat, chickweed, and hairy buttercup, was highest (97%) for winter wheat and lowest (45%) for chickweed when the red-edge band was excluded from the classification (Table 2).
In contrast, when the classification and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup was generated with the red-edge band included, the overall mapping accuracy increased by about 7% (Table 3). Likewise, the kappa coefficient increased by approximately 0.09 (Table 3).
Furthermore, chickweed had the highest user's accuracy of around 91%, whereas winter wheat had the lowest user's accuracy of approximately 68% in the classification with the red-edge band included (Table 3). The producer's accuracy was highest (94%) for winter wheat and lowest (58%) for chickweed (Table 3).

4. Discussion

The canopy cover of weeds has been associated with weed patchiness, greenness, and distribution pattern. For example, Rasmussen et al. [36] found that the higher the weed coverage, the higher the average Normalized Difference Vegetation Index (NDVI) values and the lower the weed patchiness. Furthermore, they found sections of the field with high average weed coverage to be associated with less concentrated weed distributions, whereas sections of the field with low average weed coverage had more concentrated weed distributions. This implies that the north and western parts of the agricultural field plot, which predominantly had chickweed and hairy buttercup, likely had higher NDVI values and lower weed patchiness relative to the south and eastern parts of the field. Likewise, the north and western portions of the field possibly had less concentrated weed distributions relative to the south and eastern sections.
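The NDVI referred to above is computed from red and near-infrared reflectance as (NIR − Red)/(NIR + Red), so denser green canopy pushes the index toward 1. A minimal sketch follows; the reflectance values are illustrative, not measurements from this field.

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps guards against divide-by-zero

# Hypothetical reflectance values: dense canopy vs. sparse cover
dense = ndvi(0.6, 0.1)    # high NDVI: strong NIR, low red abs,orbed by chlorophyll
sparse = ndvi(0.3, 0.2)   # low NDVI: weak NIR/red contrast
```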
The addition of the red-edge wavelength from the drone with an Altum multispectral camera was found to be beneficial to the detection and classification of the canopy cover of winter wheat, chickweed, and hairy buttercup. Other studies have also found the red-edge band useful in the detection and mapping of agricultural crops [37,38]. With the addition of the red-edge band, winter wheat had the highest increase in user accuracy of about 10%, whereas chickweed had the lowest increase in user accuracy of around 3%. This implies the addition of the red-edge band was better at detecting the canopy cover of winter wheat relative to chickweed and hairy buttercup. This is possibly because winter wheat was the main crop with a large vegetative cover. Furthermore, the drone images were captured during the heading/flowering growth stage of winter wheat, and that likely shaded the canopy cover of chickweed and hairy buttercup. In addition, the similar green-on-green color, shape and texture of winter wheat, chickweed, and hairy buttercup likely affected their detection and mapping [39,40,41]. Nonetheless, the addition of the red-edge wavelength from the drone with an Altum multispectral camera improved the overall classification and mapping of winter wheat, chickweed, and hairy buttercup.
The improved detection of weeds (chickweed and hairy buttercup) demonstrated in this study has implications for herbicide application. It has the potential to significantly minimize herbicide input into the environment and consequently reduce environmental and climate impacts [42,43]. A reduction in the spraying volume of herbicide application will potentially reduce the cost of weed management [11].
The detection and discrimination of winter wheat and associated weed canopy cover is especially useful when using a drone platform relative to satellite data. This is because real-time data can be acquired more easily with drones than with satellites. Furthermore, drone data are generally acquired at a higher spatial resolution than satellite data. However, there are some limitations to the adoption of site-specific weed management monitoring systems such as drones. These include, but are not limited to, the spatial and temporal resolution levels of weed detection. The spatial resolution is expected to influence the treatment of sub-fields, large weed patches, and individual plants [36]. This implies that the improved mapping accuracies of chickweed and hairy buttercup derived in this study will likely change if the spatial resolution of the drone dataset changes. Furthermore, the mapping accuracies will possibly change if other platforms, such as satellite datasets, are used in the classification process. In addition, because weed infestations are frequently inconsistent over time [44,45], the winter wheat, chickweed, and hairy buttercup discrimination and detection accuracies will likely change during the growing season. Therefore, regular drone data acquisition is essential to support weed management for sustainable agricultural production. However, adverse weather conditions can be a limiting factor for regular drone flights in agricultural fields. In addition, high-resolution drone datasets generally require a long processing time for weed mapping and analysis [46]. These could be limiting factors, especially for large-scale farming production. The extensive skills required for drone image processing, as well as licensing requirements, could also limit the adoption of a site-specific weed management drone monitoring system by producers.
The improved winter wheat, chickweed, and hairy buttercup detection map generated by the addition of the red-edge band could contribute to the estimation of field yield losses in winter wheat production. This is because small grain yield and yield components have been found to be negatively correlated with weed biomass [47]. However, crop yield losses related to weed density vary substantially [48]. This is possibly due to different times of weed emergence and differences in soil and microclimatic conditions [47,49].

5. Conclusions

The contribution of the red-edge band from a drone with an Altum multispectral camera was explored in this study for the detection and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup in agricultural field plots. The red-edge band was found to be beneficial in improving the classification and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup. It improved the overall classification and mapping accuracy by about 7% relative to the classification that excluded the red-edge band. Furthermore, the addition of the red-edge wavelength also improved the average user's mapping accuracy of winter wheat, chickweed, and hairy buttercup by about 3%. However, the improved detection and mapping accuracies of the canopy cover of winter wheat, chickweed, and hairy buttercup will likely change with the spatial and temporal resolution levels of the crop/weed detection. Nonetheless, with emerging drone technologies, it is important to utilize multispectral cameras with the red-edge band, especially for winter wheat crop and weed detection. This is because the inclusion of the red-edge wavelength in winter wheat and weed classification is expected to improve mapping accuracy and consequently enhance monitoring and management. Further study will explore the addition of red-edge-derived vegetation indices from drones with an Altum multispectral camera in the mapping of winter wheat and associated weed canopy cover. In addition, understanding weed dynamics during the growth stages of winter wheat is also an area for further research. Finally, examining deep learning relative to machine learning classification approaches for the detection and mapping of winter wheat and associated weed canopy cover is also an area to be explored.

Author Contributions

Conceptualization, C.E.A.; methodology, C.E.A.; software, C.E.A.; validation, C.E.A.; formal analysis, C.E.A.; investigation, C.E.A.; resources, C.E.A.; data curation, C.E.A.; writing-original draft preparation, C.E.A. and S.D.; writing-review and editing, C.E.A. and S.D.; visualization, C.E.A.; supervision, C.E.A.; project administration, C.E.A.; funding acquisition, C.E.A. and S.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by United States Department of Agriculture (USDA)-National Institute of Food and Agriculture (NIFA) through the Agriculture and Food Research Initiative (AFRI) Small and Medium-Sized Farms program, grant number 2021-69006-33875.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.

Acknowledgments

Many thanks to the College of Agriculture, TSU for providing the necessary logistics and support in performing the research objectives. Our gratitude to Eddie Williams (Farm Manager) and his technicians for all farm management-related support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhao, C.J. Advances of research and application in remote sensing for agriculture. Trans. Chin. Soc. Agric. Mach. 2014, 45, 277–293.
  2. Kang, Y.; Meng, Q.; Liu, M.; Zou, Y.; Wang, X. Crop Classification Based on Red-Edge Features Analysis of GF-6 WFV Data. Sensors 2021, 21, 4328.
  3. Chen, Z.X.; Ren, J.Q.; Tang, H.J.; Shi, Y.; Liu, J. Progress and perspectives on agricultural remote sensing research and applications in China. J. Remote Sens. 2016, 20, 748–767.
  4. Hunter III, J.E.; Gannon, T.W.; Richardson, R.J.; Yelverton, F.H.; Leon, R.G. Integration of remote-weed mapping and an autonomous spraying unmanned aerial vehicle for site-specific weed management. Pest Manag. Sci. 2020, 76, 1386–1392.
  5. Lopez-Granados, F. Weed detection for site-specific weed management: Mapping and real-time approaches. Int. J. Weed Biol. Ecol. Veg. Manag. 2011, 51, 1–11.
  6. Flessner, M.L.; Burke, I.C.; Dille, J.A.; Everman, W.J.; VanGessel, M.J.; Tidemann, B.; Manuchehri, M.R.; Soltani, N.; Sikkema, P.H. Potential wheat yield loss due to weeds in the United States and Canada. Weed Technol. 2021, 35, 916–923.
  7. Wilson, B.J.; Wright, K.J.; Brain, P.; Clements, M.; Stephens, E. Predicting the competitive effects of weed and crop density on weed biomass, weed seed production and crop yield in wheat. Int. J. Weed Biol. Ecol. Veg. Manag. 1995, 35, 265–278.
  8. Adeux, G.; Vieren, E.; Carlesi, S.; Bàrberi, P.; Munier-Jolain, N.; Cordeau, S. Mitigating crop yield losses through weed diversity. Nat. Sustain. 2019, 2, 1018–1026.
  9. Singh, S.; Pandey, P.; Khan, M.S.; Semwal, M. Multi-temporal High Resolution Unmanned Aerial Vehicle (UAV) Multispectral Imaging for Menthol Mint Crop Monitoring. In Proceedings of the 2021 6th International Conference for Convergence in Technology (I2CT), Maharashtra, India, 2–4 April 2021; pp. 1–4.
  10. Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Naiken, V.; Mabhaudhi, T. Predicting the Chlorophyll Content of Maize over Phenotyping as a Proxy for Crop Health in Smallholder Farming Systems. Remote Sens. 2022, 14, 518.
  11. Bilodeau, M.F.; Esau, T.J.; MacEachern, C.B.; Farooque, A.A.; White, S.N.; Zaman, Q.U. Identifying hair fescue in wild blueberry fields using drone images for precise application of granular herbicide. Smart Agric. Technol. 2023, 3, 100127.
  12. Clevers, J.G.P.W.; Gitelson, A.A. Remote estimation of crop and grass chlorophyll and nitrogen content using red-edge bands on Sentinel-2 and -3. Int. J. Appl. Earth Obs. Geoinf. 2013, 23, 344–351.
  13. Weichelt, H.; Rosso, R.; Marx, A.; Reigber, S.; Douglass, K.; Heynen, M. The RapidEye Red-Edge Band-White Paper. Available online: https://apollomapping.com/wp-content/user_uploads/2012/07/RapidEye-Red-Edge-White-Paper.pdf (accessed on 3 November 2022).
  14. Xianju, L.; Gang, C.; Jingyi, L.; Weitao, C.; Xinwen, C.; Yiwei, L. Effects of RapidEye Imagery’s Red-edge Band and Vegetation Indices on Land Cover Classification in an Arid Region. Chin. Geogr. Sci. 2017, 27, 827–835.
  15. Sun, L.; Chen, J.; Guo, S.; Deng, X.; Han, Y. Integration of Time Series Sentinel-1 and Sentinel-2 Imagery for Crop Type Mapping over Oasis Agricultural Areas. Remote Sens. 2020, 12, 158.
  16. Recio, J.A.; Helmholz, P.; Muller, S. Potential Evaluation of Different Types of Images and Their Combination for the Classification of GIS Objects Cropland and Grassland; The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences: Hannover, Germany, 2011; Volume XXXVIII-4/W19, p. 6.
  17. Forkuor, G.; Dimobe, K.; Serme, I.; Tondoh, J.E. Landsat-8 vs. Sentinel-2: Examining the added value of Sentinel-2’s red-edge bands to land-use and land-cover mapping in Burkina Faso. GIScience Remote Sens. 2018, 55, 331–354.
  18. Luo, C.; Liu, H.; Lu, L.; Liu, Z.; Kong, F.; Zhang, X. Monthly composites from Sentinel-1 and Sentinel-2 images for regional major crop mapping with Google Earth Engine. J. Integr. Agric. 2021, 20, 1944–1957.
  19. Zahir, I.A.D.M.; Omar, A.F.; Jamlos, M.F.; Azmi, M.A.M.; Muncan, J. A review of visible and near-infrared (Vis-NIR) spectroscopy application in plant stress detection. Sens. Actuators A Phys. 2022, 338, 113468.
  20. Meng, H.; Li, C.; Liu, Y.; Gong, Y.; He, W.; Zou, M. Corn Land Extraction Based on Integrating Optical and SAR Remote Sensing Images. Land 2023, 12, 398.
  21. Hodges, J.A.; Norrell, R.J.; Sarah, M.H. Tennessee; Encyclopedia Britannica, Inc.: Chicago, IL, USA, 2018. Available online: https://www.britannica.com/place/Tennessee (accessed on 10 January 2023).
  22. United States Climate Data. Climate Nashville-Tennessee. Available online: https://www.usclimatedata.com/climate/nashville/tennessee/united-states/ustn0357 (accessed on 15 January 2023).
  23. USDA-NRCS. Byler Series; National Cooperative Soil Survey, United States Department of Agriculture—Natural Resources Conservation Service: Nashville, TN, USA, 2001.
  24. MicaSense. MicaSense Altum™ and DLS 2 Integration Guide; MicaSense, Inc.: Seattle, WA, USA, 2020.
  25. Agilandeeswari, L.; Prabukumar, M.; Radhesyam, V.; Phaneendra, K.L.N.B.; Farhan, A. Crop Classification for Agricultural Applications in Hyperspectral Remote Sensing Images. Appl. Sci. 2022, 12, 1670.
  26. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
  27. Islam, N.; Rashid, M.M.; Wibowo, S.; Xu, C.; Morshed, A.; Wasimi, S.A.; Moore, S.; Rahman, S.M. Early Weed Detection Using Image Processing and Machine Learning Techniques in an Australian Chilli Farm. Agriculture 2021, 11, 387.
  28. Khosravi, I.; Alavipanah, S.K. A random forest-based framework for crop mapping using temporal, spectral, textural and polarimetric observations. Int. J. Remote Sens. 2019, 40, 7221–7251.
  29. Son, N.; Chen, C.; Chen, C.; Minh, V. Assessment of Sentinel-1A data for rice crop classification using random forests and support vector machines. Geocarto Int. 2018, 33, 587–601. [Google Scholar] [CrossRef]
  30. Belgiu, M.; Dragut, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
  31. Magidi, J.; Nhamo, L.; Mpandeli, S.; Mabhaudhi, T. Application of the Random Forest Classifier to Map Irrigated Areas Using Google Earth Engine. Remote Sens. 2021, 13, 876. [Google Scholar] [CrossRef]
  32. Breiman, L. Random Forests—Random Features; Technical Report 567; Statistics Department, University of California: Berkeley, CA, USA, 1999; Available online: ftp://ftp.stat.berkeley.edu/pub/users/breiman (accessed on 15 December 2022).
  33. Sharma, R.C.; Hara, K.; Hirayama, H. A Machine Learning and Cross-Validation Approach for the Discrimination of Vegetation Physiognomic Types Using Satellite Based Multispectral and Multitemporal Data. Scientifica 2017, 2017, 9806479. [Google Scholar] [CrossRef] [PubMed]
  34. Costa, H.; Almeida, D.; Vala, F.; Marcelino, F.; Caetano, M. Land Cover Mapping from Remotely Sensed and Auxiliary Data for Harmonized Official Statistics. Int. J. Geo-Inf. 2018, 7, 157. [Google Scholar] [CrossRef]
  35. Mather, P.M.; Koch, M. Computer Processing of Remotely-Sensed Images: An Introduction; John Wiley and Sons: Chichester, UK, 2011. [Google Scholar]
  36. Rasmussen, J.; Azim, S.; Nielsen, J. Pre-harvest weed mapping of Cirsium arvense L. based on free satellite imagery-The importance of weed aggregation and image resolution. Eur. J. Agron. 2021, 130, 126373. [Google Scholar] [CrossRef]
  37. Yi, Z.; Jia, L.; Chen, Q. Crop Classification Using Multi-Temporal Sentinel-2 Data in the Shiyang River Basin of China. Remote Sens. 2020, 12, 4052. [Google Scholar] [CrossRef]
  38. Shamsoddini, A.; Asadi, B. Crop mapping using Sentinel-1 and Sentinel-2 images and random forest algorithm. In Proceedings of the 4th Intercontinental Geoinformation Days (IGD), Tabriz, Iran, 20–21 June 2022; pp. 103–107. [Google Scholar]
  39. Hasan, A.S.M.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
  40. Al-Badri, A.H.; Ismail, N.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of weed using machine learning techniques: A review—Challenges, current and future potential techniques. J. Plant Dis. Prot. 2022, 129, 745–768. [Google Scholar] [CrossRef]
  41. Xu, B.; Meng, R.; Chen, G.; Liang, L.; Lv, Z.; Zhou, L.; Sun, R.; Zhao, F.; Yang, W. Improved Weed Mapping in Corn Fields by Combining Uav-Based Spectral, Textural, Structural, and Thermal Measurements. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4255457 (accessed on 3 January 2023).
  42. Balasundram, S.K.; Shamshiri, R.R.; Sridhara, S.; Rizan, N. The Role of Digital Agriculture in Mitigating Climate Change and Ensuring Food Security: An Overview. Sustainability 2023, 15, 5325. [Google Scholar] [CrossRef]
  43. Balafoutis, A.; Beck, B.; Fountas, S.; Vangeyte, J.; van der Wal, T.; Soto, I.; Gómez-Barbero, M.; Barnes, A.; Eory, V. Precision Agriculture Technologies Positively Contributing to GHG Emissions Mitigation, Farm Productivity and Economics. Sustainability 2017, 9, 1339. [Google Scholar] [CrossRef]
  44. Duchene, O.; Bathellier, C.; Dumont, B.; David, C.; Celette, F. Weed community shifts during the aging of perennial intermediate wheatgrass crops harvested for grain in arable fields. Eur. J. Agron. 2023, 143, 126721. [Google Scholar] [CrossRef]
  45. Kordbacheh, F.; Flaten, D.N.; Gulden, R.H. Weed community dynamics under repeated fertilization with different nutrient sources over 5 years. Agric. Ecosyst. Environ. 2023, 346, 108328. [Google Scholar] [CrossRef]
  46. Fernandez-Quintanilla, C.; Pena, J.M.; Andujar, D.; Dorado, J.; Ribeiro, A.; Lopez-Granados, F. Is the current state of the art of weed monitoring suitable for site-specific weed management in arable crops? Weed Res. 2018, 58, 259–272. [Google Scholar] [CrossRef]
  47. Marino, S. Understanding the spatio-temporal behavior of crop yield, yield components and weed pressure using time series Sentinel-2-data in an organic farming system. Eur. J. Agron. 2023, 145, 126785. [Google Scholar] [CrossRef]
  48. Oad, F.C.; Siddiqui, M.H.; Buriro, U.A. Growth and yield losses in wheat due to different weed densities. Asian J. Plant Sci. 2007, 6, 173–176. [Google Scholar] [CrossRef]
  49. Jack, O.; Menegat, A.; Gerhards, R. Winter wheat yield loss in response to Avena fatua competition and effect of reduced herbicide dose rates on seed production of this species. J. Plant Dis. Prot. 2017, 124, 371–382. [Google Scholar] [CrossRef]
Figure 1. Mean reflectance values of winter wheat, chickweed, and hairy buttercup from the drone-mounted Altum multispectral camera. Error bars represent the standard deviation of the mean.
Figure 2. Study area showing winter wheat plot at TSU campus farmland in Davidson County, Tennessee.
Figure 3. A schematic representation of the methodology.
Figure 4. The canopy cover map of winter wheat, chickweed, and hairy buttercup during the heading/flowering growth stage, generated with the red-edge band excluded.
Figure 5. The canopy cover map of winter wheat, chickweed, and hairy buttercup during the heading/flowering growth stage, generated with the red-edge band included.
Table 1. Spectral characteristics of the Altum multispectral camera mounted on the drone.

Band Name        Centre Wavelength (nm)   Bandwidth (nm)
Blue             475                      32
Green            560                      27
Red              668                      16
Red-Edge         717                      12
Near Infrared    842                      57
LWIR             11,000                   6000
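The wavelength limits implied by Table 1 follow directly from each band's centre wavelength and bandwidth (centre ± half the bandwidth). A minimal sketch, with the band values transcribed from Table 1 (the dictionary and helper names are ours, not MicaSense's):

```python
# Altum band centres and bandwidths (nm), transcribed from Table 1.
ALTUM_BANDS = {
    "Blue": (475, 32),
    "Green": (560, 27),
    "Red": (668, 16),
    "Red-Edge": (717, 12),
    "Near Infrared": (842, 57),
    "LWIR": (11000, 6000),
}

def band_limits(centre_nm, bandwidth_nm):
    """Return the (lower, upper) wavelength limits in nm."""
    half = bandwidth_nm / 2
    return (centre_nm - half, centre_nm + half)

for name, (centre, width) in ALTUM_BANDS.items():
    lo, hi = band_limits(centre, width)
    print(f"{name}: {lo:.1f}-{hi:.1f} nm")
# e.g. the red-edge band spans 711.0-723.0 nm
```

This makes explicit, for example, that the red and red-edge bands do not overlap (668 ± 8 nm vs. 717 ± 6 nm).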
Table 2. Classification error matrix with the red-edge band excluded (overall accuracy: 71%; kappa statistic: 0.58).

                                  Reference
Classes            Winter Wheat   Chickweed   Hairy Buttercup   Total
Winter Wheat       90             49          16                155
Chickweed          1              51          6                 58
Hairy Buttercup    2              12          73                87
Total              93             112         95                300

Classes            User's Accuracy (%)   Producer's Accuracy (%)
Winter Wheat       58                    97
Chickweed          88                    45
Hairy Buttercup    84                    77
Table 3. Classification error matrix with the red-edge band included (overall accuracy: 78%; kappa statistic: 0.67).

                                  Reference
Classes            Winter Wheat   Chickweed   Hairy Buttercup   Total
Winter Wheat       90             34          8                 132
Chickweed          3              68          4                 75
Hairy Buttercup    3              15          75                93
Total              96             117         87                300

Classes            User's Accuracy (%)   Producer's Accuracy (%)
Winter Wheat       68                    94
Chickweed          91                    58
Hairy Buttercup    81                    86
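The overall accuracy, kappa statistic, and per-class user's and producer's accuracies reported in Tables 2 and 3 can be reproduced directly from the confusion matrices. A sketch in plain Python, with the matrices transcribed from the tables above (rows are classified classes, columns are reference classes; the function name is ours):

```python
def accuracy_metrics(cm):
    """Accuracy metrics for a square confusion matrix given as a list of
    rows, where rows are classified (map) classes and columns are
    reference classes."""
    k = len(cm)
    n = sum(sum(row) for row in cm)
    row_tot = [sum(row) for row in cm]                       # classified totals
    col_tot = [sum(cm[i][j] for i in range(k)) for j in range(k)]  # reference totals
    diag = [cm[i][i] for i in range(k)]
    overall = sum(diag) / n
    # Expected chance agreement from the row and column marginals
    pe = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2
    kappa = (overall - pe) / (1 - pe)
    users = [d / r for d, r in zip(diag, row_tot)]       # diagonal / row total
    producers = [d / c for d, c in zip(diag, col_tot)]   # diagonal / column total
    return overall, kappa, users, producers

# Table 2: red-edge band excluded
cm_excl = [[90, 49, 16],
           [ 1, 51,  6],
           [ 2, 12, 73]]
# Table 3: red-edge band included
cm_incl = [[90, 34,  8],
           [ 3, 68,  4],
           [ 3, 15, 75]]

for label, cm in [("excluded", cm_excl), ("included", cm_incl)]:
    oa, kappa, users, producers = accuracy_metrics(cm)
    print(f"Red-edge {label}: overall accuracy = {oa:.0%}, kappa = {kappa:.2f}")
# Red-edge excluded: overall accuracy = 71%, kappa = 0.58
# Red-edge included: overall accuracy = 78%, kappa = 0.67
```

Running this recovers the reported figures, including the roughly 7% gain in overall accuracy when the red-edge band is added.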
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Akumu, C.E.; Dennis, S. Effect of the Red-Edge Band from Drone Altum Multispectral Camera in Mapping the Canopy Cover of Winter Wheat, Chickweed, and Hairy Buttercup. Drones 2023, 7, 277. https://doi.org/10.3390/drones7040277
