Review

Vineyard Yield Estimation, Prediction, and Forecasting: A Systematic Literature Review

by
André Barriguinha
1,*,
Miguel de Castro Neto
1 and
Artur Gil
2,3
1
NOVA Information Management School (NOVA IMS), Campus de Campolide, Universidade Nova de Lisboa, 1070-312 Lisboa, Portugal
2
IVAR-Research Institute for Volcanology and Risks Assessment, University of the Azores, 9500-321 Ponta Delgada, Portugal
3
cE3c—Centre for Ecology, Evolution, and Environmental Changes & ABG—Azorean Biodiversity Group, Faculty of Sciences and Technology, University of the Azores, 9500-321 Ponta Delgada, Portugal
*
Author to whom correspondence should be addressed.
Agronomy 2021, 11(9), 1789; https://doi.org/10.3390/agronomy11091789
Submission received: 12 July 2021 / Revised: 25 August 2021 / Accepted: 31 August 2021 / Published: 7 September 2021
(This article belongs to the Special Issue Crop Yield Prediction in Precision Agriculture)

Abstract

Purpose—knowing vineyard yield in advance is a critical success factor so that growers and winemakers can achieve the best balance between vegetative and reproductive growth. It is also essential for planning and regulatory purposes at the regional level. Estimation errors are mainly due to high inter-annual and spatial variability and to inadequate or poorly performing sampling methods; improved methodologies applicable at different spatial scales are therefore needed. This paper aims to identify alternatives to traditional estimation methods. Design/methodology/approach—this study consists of a systematic literature review of academic articles indexed in four databases, collected based on multiple query strings conducted on title, abstract, and keywords. The articles were reviewed based on research topic, methodology, data requirements, practical application, and scale, using PRISMA as a guideline. Findings—the methodological approaches for yield estimation based on indirect methods are primarily applicable at a small scale and can provide better estimates than traditional manual sampling. Nevertheless, most of these approaches are still in the research domain and lack practical applicability in real vineyards by actual farmers. They depend mainly on computer vision and image processing algorithms, on data-driven models based on vegetation indices and pollen data, and on relating climate, soil, vegetation, and crop management variables that can support dynamic crop simulation models. Research limitations—this work is based on academic articles published before June 2021; scientific outputs published after this date are not included. Originality/value—this study contributes to an understanding of the approaches for estimating vineyard yield, identifies research gaps for future developments, and supports a future research agenda on this topic. To the best of the authors’ knowledge, it is the first systematic literature review fully dedicated to vineyard yield estimation, prediction, and forecasting methods.

1. Introduction

With yield being considered a quality grape and wine indicator [1,2,3,4,5], it is crucial to obtain an early estimation of the quantity of grapes per area unit. Knowing vineyard yield in advance is a key issue so that growers and winemakers can achieve the best balance between vegetative and reproductive growth, make more informed decisions on thinning, irrigation, and nutrient management, schedule the harvest, optimize winemaking operations, plan crop insurance and the grape-picking workforce, and help detect fraud [6,7,8].
The traditional methods [9] are considered destructive, labor-demanding, and time-consuming [4], with low accuracy [10] due primarily to operator errors [11] and sparse sampling (when compared to the inherent spatial variability in a production vineyard [5,12]). They rely on manual sampling: yield is estimated from the weight of sampled clusters and the number of clusters per vine, combined with historical data and extrapolation based on the number of vines in a plot. The main efforts towards improved yield models in the vineyard, considered one of the most complex phenotypic traits in viticulture [13], are in most cases focused on image analysis for grape detection at the field level, with a significant drawback derived from cluster occlusion [14,15].
The growing adoption of Precision Agriculture (PA) practices, closely related to ongoing advances in Geospatial Technologies (GT), Remote Sensing (RS), Proximal Sensing (PS), the Internet of Things (IoT), Unmanned Aerial Vehicles (UAVs), Big Data Analytics (BDA), and Artificial Intelligence (AI) [16,17,18,19], is fueling their application in Precision Viticulture (PV) [20]. Here, the importance of the wine industry drives the development of innovative methods and technologies to cope with the heterogeneity within vineyards, which results from high inter-annual and spatial variability derived from the effects of soil and climate conditions, grapevine variety, biotic and abiotic stresses, and vineyard management practices, among others [18,21]. However, despite being a hot research topic in recent years, PV still lacks solutions that can transfer the acquired knowledge and methods to the field and provide decision-support tools for winegrowers.
Models based on statistically significant relationships between predictors and grapevine parameters are increasingly being overtaken by crop models that can dynamically simulate and integrate, over different time frames, plant traits and other variables regarding management, soil, and climate data [22]. This is particularly relevant, as grape production for wine is closely related to climate variables characterized in recent years by high inter-annual variability, with direct adverse effects for wine producers that tend to be amplified under projected climate change scenarios [23,24,25,26]. Nowadays, zoning wine production areas, especially denomination areas, is becoming increasingly critical for identifying and characterizing the homogeneous areas that are the basis of regulatory measures over wine [8], for enabling marketing strategies regarding controlled origins [27], and also with respect to climate change, which requires decisions at a regional level concerning the adaptability of different varieties and mitigation management options in one of the most important crops in Europe [28]. PV must therefore be applied both at the field level and at larger scales, where spatial variability may reveal general trends of variation not perceptible at smaller scales [29].
The purpose of the present paper is three-fold: first, to identify the research approaches for predicting yield in vineyards for wine production that can serve as alternatives to traditional estimation methods; second, to characterize these new approaches, identifying and comparing their applicability under field conditions, scalability with respect to the objective, accuracy, advantages, and shortcomings; and third, to identify research gaps for future developments and support a future research agenda on this topic. To achieve this goal, a systematic literature review was conducted using the PRISMA statement as a guideline [30].

2. Methodology

To identify the relevant scientific work already published on vineyard yield estimation, prediction, and forecasting, the authors carried out a systematic literature review of academic articles indexed on the Scopus, Web of Science, ScienceDirect, IEEE, MDPI, and PubMed databases, using the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) statement as a guideline [30]. Other databases such as Google Scholar and ResearchGate were not considered because a preliminary study undertaken by the authors showed that they would only contribute to a significant increase in duplicate articles.
Depending on the approach, the terminology behind knowing as far in advance as possible the quantity of grapes that will be harvested can be referred to as (1) estimation—when the goal is to find the parameter that best describes a multivariate distribution of a historical dataset; (2) prediction—when a model built on a dataset is used to compute values for unseen data; and (3) forecasting—when a temporal dimension is explicitly added to a prediction problem. In the present review, the authors adopted the broader term of yield estimation, although the other terms were considered keywords in the search criteria.
The authors adopted a search criteria query string conducted on the title, abstract, and keywords, using all the combinations of the following keywords: “yield” OR “production” AND “estimation” OR “prediction” OR “forecasting” AND “vineyard” OR “grapevine”. Only peer-reviewed journals, conference articles, and book chapters were considered for screening.
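The combination logic of the search criteria can be sketched programmatically. This is an illustrative reconstruction only: the keyword groups come from the stated criteria, while the parenthesized boolean formatting is an assumption, since each database has its own query syntax.

```python
# Illustrative sketch of assembling the review's search query; the
# parenthesized boolean formatting is an assumption (databases differ).
yield_terms = ["yield", "production"]
method_terms = ["estimation", "prediction", "forecasting"]
crop_terms = ["vineyard", "grapevine"]

def build_query(groups):
    # OR within each keyword group, AND between groups.
    return " AND ".join(
        "(" + " OR ".join(f'"{t}"' for t in g) + ")" for g in groups
    )

query = build_query([yield_terms, method_terms, crop_terms])
print(query)
```

Running this produces the full combination conducted on title, abstract, and keywords.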
As the goal is to identify alternatives to the traditional manual sampling method of determining vineyard yield in advance, studies based solely on that method were excluded from the final data set.
A total of 455 articles published between 1981 and 2021 were found. These articles were first reviewed based on whether title and abstract met the search criteria, including the indicated keywords, resulting in 239 articles that were retrieved from the respective databases. Further reading resulted in the final 82 records included in the review that met the research criteria for scientific studies on vineyard yield estimation, prediction, and forecasting (Figure 1).
The final record data set was categorized based on ten different methodological approaches identified for yield estimation in the screening phase, all falling into the broader group of indirect estimation models derived mainly from dynamic or crop simulation models and data-driven models. These were subdivided into more specific approaches: A—data-driven models based on computer vision and image processing; B—data-driven models based on vegetation indices; C—data-driven models based on pollen; D—crop simulation models; E—data-driven models based on trellis tension; F—data-driven models based on laser data processing; G—data-driven models based on radar data processing; H—data-driven models based on RF data processing; I—data-driven models based on ultrasonic signal processing; J—other data-driven models. Data regarding the year, journal distribution, data sources, test environment, applicability scale, and related variables used in estimation and accuracy were evaluated for each methodological approach. The abbreviations and acronyms used are listed in Table 1.

3. Results and Discussion

Looking at the scientific peer-reviewed journal distribution (Figure 2), it is interesting to see the vast scope of this topic in the research community, with publications in 38 different journals, most with diverse subjects and scopes ranging from agronomy to robotics, climate, and sensors. The top six cover 45% of the total papers published, with the remaining 39 (55%) published in 32 different journals.
For an overall perception of the ten different methodological approaches identified for yield estimation, they are represented in Figure 3, created with Circos [31].
On the right side of the semicircle are the methodologies (from A to J), and on the left side, the years of publication (from 1987 to 2021, not considering years with no identified records). The included records are arranged circularly in segments and joined with scaled and colored ribbons that quantitatively relate the year of publication to the different methodological approaches. The relationship between both appears in the inner circle, where thickness and color represent the percentage of the relationship. Taking the year 2020 as an example, 18 records were included in the present review. Of those, 11 (61% of the 2020 records) are related to A (data-driven models based on image processing algorithms), representing 22% of the 50 records on that approach. Visually, we can see that since 2009 there has been continuous production of articles on this topic, with increasing research interest since 2018 and a peak in 2020 (for 2021, the data only covers five months). Regarding methodological approaches, the focus of researchers dealing with this complex topic is on data-driven models based on image processing algorithms (A) (61%), followed by data-driven models based on vegetation indices (B) (9%) and data-driven models based on pollen (C) (9%).
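The percentages in the 2020 example can be verified with a line of arithmetic; the record counts below are taken directly from the text above.

```python
# Worked check of the Figure 3 example: of the 18 records from 2020,
# 11 belong to approach A, which has 50 records in total.
records_2020, records_2020_in_A, records_A_total = 18, 11, 50

share_of_2020 = round(100 * records_2020_in_A / records_2020)
share_of_A = round(100 * records_2020_in_A / records_A_total)
print(share_of_2020, share_of_A)  # 61 22
```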
Crop yield estimation has a high degree of complexity. In most cases, it involves the characterization of driving factors related to climate, plant, and crop management [32] that directly influence the three yield components—the number of clusters per vine, berries per cluster, and berry weight [12]—explaining 60%, 30%, and 10% of the yield, respectively [24,33]. The different general methodological approaches used for vineyard yield estimation can be divided firstly by scale (in-field level vs. regional level) and secondly into direct methods (based on manual sampling) and indirect methods (statistical models, regression models, proximal/remote sensing, and dynamic or crop simulation models). The latter depend primarily on image identification and/or related climate, soil, vegetation, and crop management variables [32,34,35] that can also support crop simulation models, data-driven models [23], and mechanistic growth models [36].
The standard or traditional methods retrieve limited data and produce a static prediction in a multi-step process of determining the average number of clusters per vine, number of berries per cluster, and weight per cluster or berry, with an overall error of around 10% that depends greatly on adequate staffing and extensive historical databases of cluster weights and yields [37].
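The traditional multi-step extrapolation can be sketched as follows; all sample values are hypothetical and serve only to show the structure of the calculation (yield components per vine, scaled by the number of vines in the plot).

```python
# Hedged sketch of the traditional extrapolation: average yield
# components per vine, scaled to the plot. All numbers are hypothetical.
def estimate_plot_yield_kg(clusters_per_vine, berries_per_cluster,
                           berry_weight_g, vines_per_plot):
    vine_yield_g = clusters_per_vine * berries_per_cluster * berry_weight_g
    return vine_yield_g * vines_per_plot / 1000.0  # g -> kg

# e.g., 20 clusters/vine, 100 berries/cluster, 1.5 g/berry, 2000 vines
print(estimate_plot_yield_kg(20, 100, 1.5, 2000))  # 6000.0
```

Errors in any of the sampled components propagate multiplicatively, which is one reason sparse sampling performs poorly against real spatial variability.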
Computer vision and image processing lead the alternative methods and are among the most utilized techniques for attempting an early yield estimation. Still, other approaches exist, such as Synthetic Aperture Radar (SAR), low-frequency ultrasound [38], RF signals [39], counting the number of flowers [40,41,42,43,44,45,46,47], Boolean model application [48], shoot count [49], shoot biomass [50,51], frequency-modulated continuous-wave (FMCW) radar [52,53], detection of specular spherical reflection peaks [54], and the combination of RGB and multispectral imagery [55] along with derived occlusion ratios.
Whatever the indirect method used, they all allow a fast and non-invasive alternative to manual sampling. They allow identifying single berries in images, even taken from a simple device such as a smartphone [56,57,58] and then using different methods such as convolutional neural networks [1,59], cellular automata [60], or even sensors capable of collecting phenotypic traits of grape bunches, that are known to be related with grapevine yield [14,61], to estimate yields.
Approaches such as non-productive canopy detection using green pixel thresholding in video frames, local thresholding and Self-Organizing-Maps on aerial imagery [62], light detection and ranging (LiDAR) for vineyard reconstruction [63], and map pruning wood [64] do not allow direct estimation of the yield but instead provide data layers to relate or use directly or as a correction coefficient in other methodologies, as they can show a relationship to yield.
Vegetation indices have experienced exponential growth in research related to productive and vegetative parameters in vineyards [65,66]. Derived from satellite imagery, UAVs [65,67], Unmanned Ground Vehicles (UGVs), or sensors mounted on tractors and Utility Terrain Vehicles (UTVs) [68], the Normalized Difference Vegetation Index (NDVI) [11], Leaf Area Index (LAI) [5,68], and Water Index (WI) (of added importance in rainfed vineyards, where water deficits play a significant role) [69] are predictors of spatial yield variability using passive and/or active sensors.
Other indirect methods include Bayesian growth models [70], weather-based models [71], models based on a combination of variables (meteorological, phenological and phytopathological) [6,72], dynamic crop models such as the “Simulateur mulTIdisciplinaire pour les Cultures Standard” (STICS) [73,74], crop biometric maps [75], and the continuous measurement of the tension in the horizontal (cordon) support wire of the trellis [37,76,77], also used to determine the best moment of hand sampling for yield estimation [78].
Predicting yield at a larger scale makes more sense now than ever as inter-annual variations attributed to climate change are entering a complex equation where quality, sustainability, efficiency, commercial and marketing strategies, regulations, insurances, stock management, and quotas are all related to yield forecasting [24]. However, at a regional level, there are few examples of yield forecasting. Those can be divided mainly into climate-based models estimating grape and wine production [23,79,80,81], pollen-based models [24,82,83,84], a combination of one or both with phenological and phytopathological variables [6,8], STICS models [74], and models based on correlations with indices such as NDVI, LAI, and NDWI [85].
Harvest estimation is a problem to which machine learning, computer vision, and image processing can be applied using one or a combination of techniques [86,87,88]. In proximal sensing methods, detection, segmentation, and counting of either individual grapes or bunches are complex in most image-based methodologies [38,59,89], especially in non-disturbed canopies where occlusion [15,90], illumination, colors, and contrast [91,92] are challenging and in most cases are only demonstrated conceptually at a small scale [89].
Together with Data Science, Artificial Intelligence, and Deep Learning, vineyard yield estimation can be applied at larger scales, not only through image analysis algorithms but also by identifying relevant predictive variables using data associated with climate, yield, phenology, fertilization, soil, maturation [23,40], and diseases [93], making use of a growing number of remote sensing [85] and phenotyping platforms that allow quantitative assessment of plant traits, among which yield falls [94,95].

3.1. A-Data-Driven Models Based on Computer Vision and Image Processing (n = 50)

Table 2 summarizes the records included in the systematic review regarding the use of computer vision and image processing techniques for yield estimation based on images, recorded mainly with still or mounted standard Red, Green, and Blue (RGB) and RGB-Depth (RGB-D) cameras, mostly under field conditions at a local application scale. The main goal is to extract variables from the images that can be related to the actual yield, such as the number of berries, bunch/cluster area, leaf area, and the number of flowers, stems, and branches. This can be accomplished with various computer vision, machine learning, and deep learning approaches.
From the retrieved results, we can say that computer vision and image processing are the most utilized techniques for attempting an early yield estimation as an alternative to traditional sampling methods. This type of methodology largely mimics manual sampling, removing the time-consuming and labor-demanding tasks of collecting destructive samples from designated smart points that are weighed and used in extrapolation models adjusted with historical data and empirical knowledge from the viticulturist. The process can be divided into the actual data collection—preferably conducted under field conditions—and the interpretation of the collected data—analyzing the extracted features—resulting in a yield estimation.
The images can be acquired using a still camera [4,10,96] in a laboratory or under field conditions, or by other optical or multispectral proximal sensors: on-the-go using ATVs [12,40,97] or other terrestrial autonomous vehicles [7,48], including autonomous robot systems [15,102,106]; by UAVs [101,105], which cope with the limitations of ground vehicles regarding field conditions (slopes and soil); or, more simply, on foot with a smartphone [57].
Acquiring data on-the-go without user intervention represents a considerable expected improvement over traditional methods, as it allows, in the limit, monitoring the entire plot autonomously, creating estimation maps at earlier stages that can be updated regularly until harvest and, in some cases, permitting viticultural practices that can rectify key parameters and facilitate selective harvest [97]. Also, data collection can take place simultaneously with other agronomic operations, reducing acquisition costs. The data collected can be used to determine multiple parameters directly correlated with yield, as well as to assess cultural practices, vineyard status [10], and quality [96].
The more challenging aspect of the approach is transforming the collected data into an actual yield estimation. The most common approach is to automatically identify individual grapes or clusters for size determination (e.g., [10,15,100,113]) or other vine structures [115], along with 3D reconstruction [13,57,96,104,108,109,110], to estimate the actual yield. In the model development phase, this mostly requires training and validation supported by manually assessing cluster weight and berry number per cluster after image acquisition. The shortcoming shared with the traditional approach is that the models are mostly variety-dependent, while a commercial solution needs to cope with all the different varieties in a vineyard. According to Millan et al. [46], this can be resolved using a base model for identifying flower number per inflorescence, which has the theoretical potential to be variety-independent. However, according to the same authors, the number of flowers per inflorescence alone is insufficient for correct yield estimation and needs to be combined with the fruit set rate and/or the average berry weight. A single variety-independent linear model is also referred to by Aquino et al. [42] but reported by Liu et al. [44] as not feasible unless a similarity in both structure and development stage occurs. Different authors indeed report flower number as an important explanatory variable for estimating yield [40,43,44] that can give a very early estimate, although it is not often used in traditional manual approaches, as it tends to amplify the already mentioned shortcomings of cluster sampling.
Another aspect that needs to be pointed out is that a considerable part of the studies was conducted under laboratory conditions, and the results must be validated under field conditions, which are typically very challenging. Also, some of those made “under field conditions” bear more similarity to controlled environments, with the vineyard adapted to the methodology and the proposed goal (e.g., counting berry number) instead of the other way around.
One major disadvantage is that 2-D or even stereo images do not provide measurement data on the depth of the scene [53], and image analysis algorithms are very sensitive to occlusion [98,99], which can take the form of self-occlusions (berries hidden behind berries within the same grape cluster), cluster occlusions (berries hidden behind other grape clusters), and vine occlusions (berries hidden behind the leaves and shoots of the vine) [7]. Furthermore, environmental dynamics such as leaf movements due to wind and changing illumination conditions are challenging when working under field conditions [108]. This has led some researchers to conduct image acquisition at night [12], allowing them to isolate the vines under evaluation from those in the adjacent row [97] (more relevant in more defoliated vineyards). Occlusion problems can also be partly resolved by detecting the specular reflection peaks from the spherical surface of the grapes in high-resolution images taken under artificial lighting at night [54], or by using a Boolean model to assess berry number that can estimate partially hidden berries from images collected on-the-go at 7 km/h [48].
Regarding yield explanatory variables, it is unclear which provide better accuracy, as the estimation errors presented vary within the same intervals for different variables. The accuracy seems to depend more on the methodological approach used for data collection and the robustness of the algorithms used to derive yield. Compared with the 0.58 < R2 < 0.75 reported for traditional methods [9], this approach can provide better, but also worse, results.
An issue pointed out by some authors [12] is that management practices (e.g., trellis, leaf-pulling, shoot/cluster thinning and shoot positioning) can directly impact data acquisition, mainly affecting the relation between what is measured and the predicted yield. It means that the choice of methodology must be aligned with the winegrower’s type of management.
One important question is how early an accurate yield estimation can be obtained. Aquino et al. [97] and Palacios et al. [40] suggested that it is possible to accurately predict yield by monitoring vines at phenological stages between full flowering and cluster-closure (nearly four months preharvest at the earliest), taking into consideration that a global multi-varietal model requires training on large datasets to be operationalized successfully. Liu et al. [49] go further, using video images to detect shoots, allowing yield estimation five months earlier and removing the need for prior training by using an unsupervised feature selection algorithm combined with unsupervised learning. However, as the authors point out, the approach relies heavily on an accurate estimate of the bunch-to-shoot ratio (time-consuming and prone to selection bias).
Although not often discussed, as all of the different approaches are conducted at a small scale, the use of data-driven models based on computer vision and image processing at larger scales poses a problem regarding computational power [13,89], which must be addressed to overcome the same limitation already identified in traditional methods regarding poor sampling. Rose et al. [13] proposed a pipeline for yield parameter estimation using 3D data for future automated, high-throughput, large-data phenotyping tasks in the field.
Of the methods listed in Table 2, none are reported as being used by winegrowers under field conditions in commercial vineyards; even those that resulted in apps, despite their potential, still lack the knowledge-transfer jump required to help winegrowers.

3.2. B-Data-Driven Models Based on Vegetation Indices (n = 7)

Table 3 summarizes the records included in the systematic review regarding the use of data-driven models based on vegetation indices. Remote and proximal sensing are used to measure the light reflected by plants in different portions of the spectrum, allowing the development of various vegetation indices that can provide useful information on plant structure and condition [116] in the form of mathematical expressions that produce values related to crop growth, vigor, and several other vegetation properties. There are 519 different indices reported in the Index Database [117]. Those more recently used in agriculture for yield are listed by Sishodia [16] and reported as better indicators for full-cover crops (e.g., horticulture and cereals) than for discontinuous crops (e.g., olives and vineyards), where, in addition to soil effects, the spectral measurement describes only part of the canopy, mostly the top [65]. Regarding soil, however, the impact tends to be low, as the vineyard's critical growing stage (where index/yield correlations tend to increase) occurs when cover crops are, in most cases, senescent [5]. For vineyard yield estimation, the records found refer mainly to NDVI and LAI [5,16].
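As a concrete illustration of how such an index is derived, NDVI is computed from red and near-infrared reflectance as (NIR − Red)/(NIR + Red); the following minimal sketch uses synthetic per-pixel reflectance values in place of real sensor data.

```python
# Minimal NDVI sketch; the red/NIR reflectance values are synthetic
# stand-ins for actual multispectral sensor data.
red = [0.08, 0.10, 0.30]
nir = [0.45, 0.50, 0.35]

ndvi = [(n - r) / (n + r) for n, r in zip(nir, red)]
print([round(v, 2) for v in ndvi])  # [0.7, 0.67, 0.08]
```

Dense, healthy canopy pixels approach 1, while bare soil or stressed vegetation sits near zero, which is why soil background matters for discontinuous crops such as vineyards.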
Data sources vary mainly among handheld or mounted spectroradiometers [87], multispectral cameras mounted on UAVs [65], and satellite data [5]. Each has its own main advantages and disadvantages: spectroradiometers allow finer sampling with less noise, but also a sparser one; UAVs are more practical and fast and can be deployed as needed, allowing applicability at a medium scale without the satellite drawbacks of temporal and spatial resolution and cloud coverage dependency; and satellites cover larger areas, with data that can be accessed and processed at low or no cost.
Using hyperspectral reflectance spectra, Maimaitiyiming et al. [87] propose an in-depth study to address the effects of irrigation levels and rootstocks on vine productivity. As part of the study, vine productivity, including fruit yield and ripeness parameters, was measured, with 20 vegetation indices calculated and used as input for predictive model calibration. The berry yield and quality prediction models were developed with multiple linear regression (MLR), partial least squares regression (PLSR), random forest regression (RFR), a weighted regularized extreme learning machine (WRELM), and a new activation function obtained by fusing a hyperbolic tangent (Tanh) function and a rectified linear unit (ReLU) for WRELM (WRELM-TanhRe), demonstrating moderate to relatively strong correlations between berry yield and vegetation indices, namely the water index (WI) (r = 0.67), modified triangular vegetation index (MTVI) (r = 0.64), and green normalized difference vegetation index (GNDVI) (r = 0.53). Regarding yield estimation, RFR outperformed the other models in calibration (R2 = 0.86), while in the validation test, the WRELM-TanhRe model achieved the highest estimation accuracy (R2 = 0.62).
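The MLR calibration step in such studies can be sketched with an ordinary least-squares fit. Everything below is synthetic (three stand-in indices and a made-up linear relationship); it does not reproduce the original study's data or its 20-index setup, only the shape of the calibration procedure.

```python
import numpy as np

# Hedged sketch of calibrating a multiple linear regression of berry
# yield on vegetation indices; indices and coefficients are synthetic.
rng = np.random.default_rng(0)
indices = rng.uniform(0.2, 0.9, size=(30, 3))  # stand-ins for WI, MTVI, GNDVI
yield_kg = indices @ np.array([4.0, 2.5, 1.0]) + rng.normal(0, 0.1, 30)

X = np.column_stack([np.ones(len(indices)), indices])  # intercept column
coef, *_ = np.linalg.lstsq(X, yield_kg, rcond=None)

# Coefficient of determination (R^2) on the calibration set.
pred = X @ coef
r2 = 1 - ((yield_kg - pred) ** 2).sum() / ((yield_kg - yield_kg.mean()) ** 2).sum()
print(round(r2, 2))
```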
Indices such as NDVI can also strengthen traditional manual sampling through informed sampling strategies that may mitigate errors resulting from within-field variability, improving yield estimation on average by 10% when using NDVI data [11].
Using satellite data allows regional-scale estimation that can cover large areas. Gouveia et al. [80] developed multi-linear regression models of wine production, using NDVI and meteorological variables (monthly averages of maximum, minimum, and daily mean temperature and precipitation) as predictors to estimate yield in a 250,000 ha region, with R2 = 0.62 for early-season estimation and R2 = 0.90 for mid-season. A similar approach was taken by Cunha et al. [85] with the Satellite Pour l'Observation de la Terre (SPOT) 10-day synthesis vegetation product (S10) for three different regions in Portugal with significant interannual variability, based on a correlation matrix between the wine yield of a given year and the full set of 10-day synthesis NDVI.
Although they recognized the potential of NDVI, Matese et al. [65] argue that acquiring and analyzing spectral data, besides being costly (multispectral cameras), requires skills (“spectral know-how on radiometric correction and data analysis, primarily for filtering the canopy with low-temperature sensors resolution from common multispectral cameras”) not available to all farmers. As an alternative, a model based on geometric data (canopy thickness and volume) retrieved with RGB sensors outperformed NDVI data. However, the authors' statements can be debated, as low-cost NDVI cameras are becoming more available, namely Agrocam (https://www.agrocam.eu/ - accessed on 16 June 2021) and Mapir (https://www.mapir.camera - accessed on 16 June 2021), both including powerful and easy-to-use free cloud software, although their data quality can be questioned and requires validation and comparison against more recognized commercial multispectral alternatives that are pricier but offer more features, such as DJI Multispectral (https://www.dji.com/pt/p4-multispectral - accessed on 16 June 2021), Micasense (https://micasense.com - accessed on 16 June 2021), Parrot Sequoia+ (https://www.pix4d.com/product/sequoia - accessed on 16 June 2021), and Sentera (https://sentera.com/data-capture/6x-multispectral/ - accessed on 16 June 2021). Ballesteros et al. [86] used a hybrid approach combining NDVI (reflectance approach) with vegetated fraction cover as a measure of plant vigor (geometric approach), resulting in higher accuracy compared to simple NDVI use, with good results but requiring calibration for each season.
An important question is the data acquisition time frame that gives the best correlation with yield. Matese et al. [65] collected data during three seasons at the veraison phenological stage; Carrillo et al. [11] collected data before veraison; Ballesteros et al. [86] made UAV flights at several stages: fruit set, berry pea size, veraison, final berry ripening, and after harvest. Maimaitiyiming et al. [87] collected data during the late veraison and fruit ripening stages, with dates determined by the number of no-rain days after irrigation treatment initiation (considering that the study was not focused only on yield estimation). For NDVI, Gouveia et al. [80], by comparing NDVI cycles and meteorological parameters for years of low and high wine production, identified significant differences during three stages: (1) from dormancy; (2) from budbreak; and (3) starting with flowering and continuing through veraison, with a maximum at the end of spring and a minimum during winter for the selected vineyard-area pixels. They also indicate that good years for wine production reflect high photosynthetic activity during the previous autumn and spring, followed by reduced greenness and reduced growth during summer (considering the Douro region in Portugal, where the study was conducted). Sun et al. [5] found similar performance for NDVI and LAI regarding spatial yield variability, with peak correlations during the growing season that differed between years; maximum and seasonal-cumulative vegetation showed slightly lower correlations to yield. The authors state that the best time interval depends on crop type, climate/weather conditions, and management practices. Cunha et al. [85] used NDVI measured 17 months before harvest with very good results in obtaining very early forecasts of potential regional wine yield (the model explained 77–88% of the inter-annual variability in wine yield).
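The window-selection idea used by Cunha et al. [85], correlating yield with every 10-day NDVI composite and keeping the strongest, can be sketched as follows. The series, seed, and "true" dekad below are synthetic assumptions, not the study's data.

```python
import numpy as np

# Sketch (assumed procedure, loosely after Cunha et al. [85]): correlate
# inter-annual yield with NDVI at every 10-day composite (dekad) and keep
# the composite with the strongest correlation. All data are synthetic.
rng = np.random.default_rng(7)
n_years, n_dekads = 15, 36
ndvi = rng.uniform(0.2, 0.8, (n_years, n_dekads))  # years x dekads
signal_dekad = 12                                   # synthetic "true" window
yield_t = 30.0 * ndvi[:, signal_dekad] + rng.normal(0.0, 0.5, n_years)

# Pearson correlation between yield and each dekad's NDVI
corr = np.array([np.corrcoef(ndvi[:, d], yield_t)[0, 1]
                 for d in range(n_dekads)])
best_dekad = int(np.argmax(np.abs(corr)))
print(best_dekad, round(float(corr[best_dekad]), 2))
```

The same screening can be run against composites from previous seasons, which is how a window 17 months before harvest can emerge as a strong predictor.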
In line with what has already been mentioned for the data-driven models based on computer vision and image processing, this approach can provide better yield estimates. As pointed out by Sun et al. [5], performance is very dependent on environmental conditions and management strategies. For satellite data, spatial resolution can be the major bottleneck at smaller scales [85], along with less flexibility derived from temporal resolution and soil effects [86]. However, alternatives now exist, such as Sentinel-2, with 13 spectral bands (ten of them at 10 or 20 m spatial resolution), global coverage, and a five-day revisit frequency.

3.3. C-Data-Driven Models Based on Pollen (n = 7)

Table 4 shows the summary of the records included in the systematic review regarding the use of data-driven models based on pollen. These models rely on the relationship between airborne pollen and yield [82]. The assumption is that more productive years have more flowers per area unit and thus higher airborne pollen concentrations [24].
Pollen monitoring and determination of the pollen index (annual sum of the daily pollen concentrations in grains/m3) was conducted by Cristofolini et al. [83] between the days when 5% and 95% of the season’s total pollen concentration were found (between 12 and 29 days per season), with very good results (R2 = 0.92). The combination of aerobiological, phenological, and meteorological data used by González et al. [84] and Fernández et al. [6,8,72] also allowed accurate production estimates more than one or two months in advance, with Fernández et al. [72] achieving better results with a Hirst (volumetric) trap for local predictions and a Cour (passive) trap for regional yield predictions. Cunha [24] made a more comprehensive study to assess model adaptability in regions expanding fast in both area and technology, with non-irrigated areas under heavy water and thermal stress during summer. The study resulted in a regional forecast model that determines the potential yield at flowering from airborne pollen concentration and climate impact, applied to Alentejo in Portugal (one of the most arid wine regions of Europe). The determined regional pollen index (RPI) and fruit-set data as explanatory variables allowed a very good regional estimation (R2 = 0.86).
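The season delimitation and pollen index computation described for [83] can be sketched as follows. Only the 5%/95% cumulative thresholds come from the study; the daily series and everything else is a synthetic, illustrative assumption.

```python
import numpy as np

# Sketch of the pollen-season delimitation used in [83] (assumed procedure):
# the season runs between the days at which 5% and 95% of the total pollen
# has accumulated, and the pollen index sums the daily concentrations
# (grains/m3) over that window. The daily series below is synthetic.
rng = np.random.default_rng(0)
days = np.arange(60)                               # days around flowering
daily = 200.0 * np.exp(-((days - 30.0) ** 2) / 50.0) + rng.uniform(0, 2, 60)

cumulative = np.cumsum(daily) / daily.sum()
start = int(np.searchsorted(cumulative, 0.05))     # first day reaching 5%
end = int(np.searchsorted(cumulative, 0.95))       # first day reaching 95%
season_length = end - start + 1
pollen_index = float(daily[start:end + 1].sum())
print(season_length, round(pollen_index, 1))
```

The 5%/95% window trims the long, noisy tails of the pollen curve, which is one way to limit the influence of deposition, recirculation, and long-distance transport on the index.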
The main disadvantages of data-driven models based on pollen are: choosing the best placement for sampling devices so that regional spatial variability is effectively represented; the number of observations needed for model calibration (years of historical data, which, unlike weather data for instance, are not commonly available); costly and complex laboratory processes; and plant dynamics (e.g., high variation of the vineyard area around the pollen traps) [24,85]. The number of pollen traps must be related to the area of influence and the availability of grape or wine production data at the relevant spatial scale [24]. Rainfall and temperature (primarily average and maximum) influence the pollen season and thus the pollen index values: typically, higher temperature increases pollen concentration in the vineyard, while rainfall leads to lower airborne pollen concentrations [72,83]. Fertilization during the flowering period can also decrease airborne pollen concentrations [84]. For regional estimates, model performance is linked to the different approaches to calculating the RPI, and special care must be taken in identifying the beginning and end of the pollen season to avoid pollen deposition, recirculation, and long-distance transport that does not contribute effectively to local pollination but increases the RPI [24].
In line with what has already been mentioned above, these approaches can provide better yield estimates and are applicable at both local and regional scales.

3.4. D-Crop Simulation Models (n = 4)

Table 5 shows the summary of the records included in the systematic review regarding the use of crop simulation models. Crop models are important decision-support systems in agriculture [28] that allow the simulation, through mathematical equations, of plant development and its interaction with the environment by integrating phenotypic traits along with climate, soil, management decisions, and other variables considered related to yield estimation in this particular case. This approach is becoming more popular because it allows virtual experiments in a specific phenological stage, testing hypotheses that would take years under real field conditions. Another advantage is the possibility of integration into decision support systems (DSS) [71,74].
The retrieved studies are complex and not limited to yield estimates, as they simulate grapevine growth and development. The models need to be appropriately calibrated and validated; that is one of the disadvantages of this approach, as it needs to be adapted for new environments with distinct climate, soil, grapevine varieties, training systems, and management. As such, complexity and cost in terms of time and biophysical data requirements make operationality and transferability very difficult [23].
The model developed by Cola et al. [71] achieved good results in a five-year validation assessment, demonstrating flexibility and parsimony regarding meteorological data requirements. The approach used to simulate fruit load was based on gross assimilation derived from light interception, subject to thermal and water limitations.
Sirsat et al. [23] focused on grape yield predictive models for the flowering, coloring, and harvest phenostages (due to lack of data for the other phenostages, namely setting, berries pea-size, and veraison), using machine learning techniques and climatic conditions, grapevine yield, phenological dates, fertilizer information, soil analysis, and maturation index data to construct the relational dataset. The authors stated that meteorological data is the critical element for measuring the quantity of grapes, as the derived features of dew point, relative humidity, and air temperature were identified as the most favorable variables in constructing the model.
Some models, such as STICS, have been used for different types of crops with good results: Fraga et al. [74] used it for three Portuguese native varieties. The application of this model requires thorough parameterization regarding yield components and historical phenological data, computed by STICS using the growing degree day (GDD) concept. The yield simulation results demonstrated good model capability, with an overestimation in one of the regions studied and an underestimation in the other. The authors pointed out a critical factor related to the duality between quality and yield, and the need for viticultural practices such as cluster thinning to be included in the model parametrization. The same model was used by Valdes et al. [73] in non-irrigated and irrigated vineyards in Chile and France, with similar results for yield estimation: an overestimation that resulted from STICS underestimating moderate water stress after veraison.
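The growing degree day (GDD) accumulation that such models use for phenology can be illustrated with a minimal sketch (simple averaging method). The 10 °C base temperature and the daily values are illustrative assumptions, not parameters from the study.

```python
# Minimal sketch of growing degree day (GDD) accumulation (simple averaging
# method). The 10 degC base temperature and the (t_min, t_max) values are
# illustrative assumptions, not parameters from the study.
def daily_gdd(t_min: float, t_max: float, t_base: float = 10.0) -> float:
    """Degree days accumulated in a single day."""
    return max(0.0, (t_min + t_max) / 2.0 - t_base)

# Accumulate over a short synthetic run of (t_min, t_max) days
season = [(8.0, 18.0), (10.0, 22.0), (12.0, 26.0), (9.0, 15.0)]
gdd = sum(daily_gdd(t_lo, t_hi) for t_lo, t_hi in season)
print(gdd)  # 3 + 6 + 9 + 2 = 20.0
```

A phenological stage is then predicted as the date on which the accumulated GDD crosses a variety-specific threshold.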

3.5. E-Data-Driven Models Based on Trellis Tension (n = 4)

Table 6 summarizes the records included in the systematic review regarding the use of data-driven models based on trellis tension, all from the same author. This approach is an indirect real-time method that uses sensors in the wires to measure the production in each vine row. The changes in tension are recorded by automated data systems connected to load cells installed in-line. Each line needs calibration, the data must be corrected to remove the effects of temperature (using a 48 h moving average), and the effects of wind gusts are negligible because measurements are not made in continuous periods [37]. The linear regressions found in the studies demonstrate good estimates, better than those obtained with traditional manual sampling.
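The 48 h moving-average correction can be illustrated on synthetic hourly data: a 24 h, temperature-driven cycle averages to zero over any 48 h window, while the slow fruit-load trend is preserved. This is an assumed simplification of the filtering in [37], not the authors' implementation.

```python
import numpy as np

# Sketch (assumed simplification of the filtering in [37]): a 24 h,
# temperature-driven cycle in trellis-wire tension averages to zero over any
# 48 h window, so a 48 h moving average leaves only the slow trend from
# accumulating fruit mass. The hourly series below is synthetic.
hours = np.arange(24 * 20)                          # 20 days, hourly readings
fruit_trend = 0.5 * hours / 24.0                    # slowly increasing load
diurnal = 2.0 * np.sin(2.0 * np.pi * hours / 24.0)  # temperature-driven cycle
tension = 100.0 + fruit_trend + diurnal

window = 48                                         # 48 h moving average
smoothed = np.convolve(tension, np.ones(window) / window, mode="valid")
# 'smoothed' now rises almost perfectly linearly: the diurnal component is
# cancelled while the fruit-load trend (0.5 units/day) is preserved.
```

Choosing the window as an exact multiple of the 24 h period is what makes the periodic component cancel exactly rather than merely attenuate.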
The trellis tension methodology can also be used to determine the timing for traditional hand sampling for yield estimation by detecting the lag phase, thus eliminating the subjective visual and tactile field-scouting assessments of whether berries are at lag phase [78].
Despite the better estimates that can be achieved and the ability to monitor in near real-time, the applicability of this method to commercial vineyards still needs to be evaluated regarding the calibration required for different vineyards and trellis systems, consistency across seasons, installation costs, number of sensors, and spatial deployment [37].
The trellis tension monitor (TTM) has a spatial response to the removal of uniformly distributed fruit load of up to ~24 m, or ~12 m to either side of the sensor, meaning that eight to ten vines constitute a meaningful sample size [76].

3.6. F-Data-Driven Models Based on Laser Data Processing (n = 1)

Table 7 shows the summary of the records included in the systematic review regarding the use of data-driven models based on laser data processing, with only one study identified. Vine canopy properties are a good indicator of quality and yield [64]. The retrieved application shows the potential of laser scanner technology to collect plant geometric characteristics with sufficient precision to be correlated with yield, using a shoot sensor called Physiocap®, designed and developed by the CIVC (Comité Interprofessionnel du Vin de Champagne), which maps vigor spatial variability and is used during winter just before pruning [50]. The authors report that, at the scale of the Champagne vineyard (the French region where the study was conducted), the aboveground biomass estimation was strongly correlated with the yield of the following year. The estimation results are good, but extreme climate events tend to lower the correlation found at a more local scale. As this is the only study of this approach, dependent on data from a single region collected since 2011, its application to other regions must be evaluated.

3.7. G-Data-Driven Models Based on Radar Data Processing (n = 2)

Table 8 summarizes the records included in the systematic review regarding the use of data-driven models based on radar data processing, all from the same author. Three-dimensional radar imagery techniques for yield determination are reported here as an alternative to estimations based on optical or multispectral proximal or remote sensors, to deal with limitations regarding performance, occlusion, and light issues in field conditions.
Henry et al. [52,53] used ground-based frequency-modulated continuous-wave radars operating at 24, 77, and 122 GHz to estimate grape mass without contact. The major advantage is that most grapes can be detected under field conditions even if leaves, shoots, or other grapes partially or fully hide them. As for limitations, the study only addressed yield estimation at the maturation phase for five different varieties.

3.8. H-Data-Driven Models Based on Radio Frequency Data Processing (n = 1)

Table 9 shows the summary of the records included in the systematic review regarding the use of data-driven models based on radio frequency data processing, with only one record retrieved. It relies on a new exploratory scheme that senses grape moisture content using Radio Frequency (RF) signals to estimate yield without physical contact in a laboratory environment. According to the authors, it can be used for early yield estimation [39].
This study represents an exploratory approach in a laboratory environment that does not provide an actual yield estimate; therefore, its applicability to real-world scenarios still needs to be addressed. Nevertheless, it could be an alternative path for one of the main issues reported for data-driven models based on computer vision and image processing: occlusion.

3.9. I-Data-Driven Models Based on Ultrasonic Signal Processing (n = 1)

Table 10 summarizes the records included in the systematic review regarding the use of data-driven models based on ultrasonic signal processing, with only one record retrieved. Using low-frequency ultrasound is an alternative approach to detect grape clusters in the presence of foliage occlusion at a lower cost compared to alternatives such as Synthetic Aperture Radar (SAR) [38].
Despite not being a study aimed at determining yield, and despite being developed in a laboratory environment, the results are very interesting, as they can provide an alternative for one of the main issues reported for data-driven models based on computer vision and image processing, which is occlusion.

3.10. J-Other Data-Driven Models (n = 6)

Table 11 summarizes the records included in the systematic review that did not fall into one of the previously identified groups.
For regional level decision support, Fraga et al. [81] proposed a simple grape production model (PGP) based on favorable meteorological conditions. This model runs on a daily step, comparing the thermal/hydric conditions in a given year against the average conditions in high and low production years in three regional wineries, allowing one to perceive regional heterogeneity. The recognition of the importance of climate data for estimating yield at the regional level was also addressed by Santo et al. [79] with an empirical model, where temperature and precipitation averaged over the periods of February–March, May–June, and July–September, along with the anomalies of wine production in the previous five years, were used as predictors. At a local level, both climate and soil data were considered by Ubalde [34] as yield predictors, with Cation Exchange Capacity (CEC) and Winkler Index providing the best correlations with similar importance.
A different approach was taken by Ellis [70], collecting bunch mass data during three seasons and using a Bayesian growth model, assuming the double sigmoidal curve that characterizes grape growth according to the literature, to predict the yield at the end of those seasons. The author advocates Bayesian methods for their capability to systematically incorporate prior knowledge and to update the model as new data arrive. The study is not very clear regarding yield estimation and does not indicate its accuracy.
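The double sigmoidal curve assumed by such growth models can be sketched as follows; the parameters are illustrative, and the Bayesian fitting procedure of [70] is not reproduced.

```python
import numpy as np

# Sketch of the double sigmoidal curve that characterizes grape berry/bunch
# growth (two logistic phases separated by a lag). Parameters are
# illustrative; the Bayesian fitting in [70] is not reproduced here.
def double_sigmoid(t, a1=60.0, k1=0.15, t1=30.0, a2=60.0, k2=0.12, t2=80.0):
    """Bunch mass (g) at day t as the sum of two logistic growth phases."""
    return (a1 / (1.0 + np.exp(-k1 * (t - t1)))
            + a2 / (1.0 + np.exp(-k2 * (t - t2))))

days = np.linspace(0.0, 120.0, 121)
mass = double_sigmoid(days)
# Mass rises monotonically toward the asymptote a1 + a2 = 120 g; a yield
# prediction reads the fitted curve at the expected harvest date.
```

In a Bayesian treatment, priors on the six curve parameters are updated with each new in-season mass measurement, and harvest yield is read from the posterior predictive curve.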
By determining the influence of water status, leaf area (LA), and fruit load on berry weight (BW) and sugar accumulation, Santesteban et al. [29] found that average leaf water potential in summer and the LA/BN ratio, when considered together, estimated BW properly (R2 = 0.91), showing that under semiarid conditions, water availability plays the primary role in the regulation of berry growth.

4. Conclusions

As an overall conclusion, the alternative methodologies for yield estimation mentioned in this paper can, as demonstrated by the reviewed articles, surpass the limitations of traditional manual sampling methods with equal or better accuracy. They all have advantages and shortcomings but, more importantly, they still lack a fundamental key aspect: real application in a commercial vineyard.
Despite extensive research in this area, adoption at an operational level to effectively replace manual sampling estimation is residual. Methods made available to winegrowers should estimate production as far in advance as possible, be as simple as possible, and require as little data as possible, preferably data that producers can access quickly, easily, and cheaply, and, if possible, without the need for intensive training or validation. The best approach must consider the availability and/or possibility of obtaining the required inputs (the required data are sometimes not available), the adequate spatial resolution (field or regional level), the necessary granularity (information on the spatial variability within each area), and the required precision (e.g., a simple smartphone camera, despite the loss in quality, can in many cases be a cost-effective alternative to hyper- and multispectral cameras, LiDAR, ultrasonic, and radar sensors).
The synergistic use of proximal and remote sensing with AI can be one of the best ways to model a vineyard production system. Still, due to its inherent complexity, it is a difficult challenge to apply because of the diversity of field conditions, as remote sensing data is dependent on spatial, temporal, and spectral resolution, and yield is correlated with an extensive list of climate, soil and plant variables that have high temporal and spatial heterogeneity. Also, the relation to quality is one of the biases that yield estimation needs to deal with, as the producer’s management decision directly impacts both quality and yield.
For local estimation at the farm level, data-driven models based on computer vision and image processing are the ones the research community is putting more effort into, and they can be classified as the easiest for growers to deploy under real field conditions. Data acquisition can be made easily on-the-go with a vast array of solutions, ranging from a simple smartphone to an autonomous robot platform, a UAV, or even agricultural equipment. Despite good results in estimating yield, these methods are not yet fully mature. Management practices (e.g., trellis, leaf-pulling, shoot/cluster thinning, and shoot positioning) can directly impact data acquisition by affecting the relationship between what is measured and the predicted yield. There are still problems with occlusion, algorithms are generally variety-dependent, and environmental dynamics are challenging. Data acquisition speed, computational processing constraints, and the availability of predictive yield maps as output should be addressed in commercial applications.
Vegetation indices are also a good alternative, as they are easy to deploy and can be used at different scales with good results, especially NDVI. Data acquisition is generally feasible and affordable, but transforming data into usable information requires technical knowledge not always available to farmers. The past limitations linked to the direct use of multispectral satellite remote sensing data, such as insufficient spatial resolution, inadequate temporal resolution, and complex data access and processing, have been significantly overcome since the launch in mid-2015 of the EU Copernicus Program Sentinel-2 mission, combined with the development of appropriate desktop and cloud-based data processing platforms (e.g., Google Earth Engine: https://earthengine.google.com/ (accessed on 16 June 2021) [118]; Sen2-Agri: http://www.esa-sen2agri.org/ (accessed on 16 June 2021) [119]; and Sen4CAP: http://esa-sen4cap.org/ (accessed on 16 June 2021) [120]). As with models based on computer vision and image processing, corresponding operational solutions are not yet available to growers as needed. Future commercial solutions may include yield estimation algorithms in UAV data management software or web platforms such as EO Browser (https://apps.sentinel-hub.com/eo-browser/ - accessed on 16 June 2021) or the EOS Platform (https://crop-monitoring.eos.com/ - accessed on 16 June 2021), which provide multispectral satellite data and derived products and indices, with the required parametrization when needed.
Crop models were also referenced as one of the best alternatives for estimating yield. Still, few examples were identified, mainly because of the complexity of their development, especially hard in vineyards because of the inherent specificities and the required data for calibration in different locations and for different varieties.
There is also a lack of solutions for estimating yield at broader scales (e.g., regional level). The perception is that decisions are more likely to take place at a smaller scale, which in some cases is not accurate: regional estimation matters in regulated areas and in areas where support for small viticulturists is provided by institutions with proper resources and a large area of influence. This is corroborated by the fact that data-driven models based on trellis tension and pollen traps are being used for yield estimation at regional scales in real environments in different regions of the world.
Other, more residual approaches, such as laser, radar, radio frequency, and ultrasonic data, can provide new alternatives to cope with some of the difficulties encountered, especially in computer vision and image processing approaches.
Despite the use of remote and proximal sensing models with an inherent spatial component, predictive yield maps are scarcely referenced and used as an output of yield estimation models. New approaches such as GeoAI [121] are not yet referred to in the reviewed articles. As spatial variability and heterogeneity are some of the more critical parameters for decision-making in PV (the producer wants to know the quantity and where that quantity is), it is a relevant research gap that must be addressed appropriately.

Author Contributions

Conceptualization, A.B., M.d.C.N. and A.G.; methodology, A.B. and A.G.; formal analysis, A.B.; writing—original draft preparation, A.B.; writing—review and editing, A.B., M.d.C.N. and A.G.; supervision, M.d.C.N. and A.G. All authors have read and agreed to the published version of the manuscript.

Funding

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zabawa, L.; Kicherer, A.; Klingbeil, L.; Milioto, A.; Topfer, R.; Kuhlmann, H.; Roscher, R. Detection of single grapevine berries in images using fully convolutional neural networks. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Long Beach, CA, USA, 16–17 June 2019; pp. 2571–2579. [Google Scholar]
  2. Santesteban, L.G.; Royo, J.B. Water status, leaf area and fruit load influence on berry weight and sugar accumulation of cv. ‘Tempranillo’ under semiarid conditions. Sci. Hortic. 2006, 109, 60–65. [Google Scholar] [CrossRef]
  3. De la Fuente Lloreda, M. The relevance of the yield prediction methods in vineyard management. Bull. OIV 2014, 87, 387–394. [Google Scholar]
  4. Diago, M.-P.; Tardaguila, J.; Aleixos, N.; Millan, B.; Prats Montalbán, J.; Cubero, S.; Blasco, J. Assessment Of Cluster Yield Components By Image Analysis. J. Sci. Food Agric. 2015, 95. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Sun, L.; Gao, F.; Anderson, M.C.; Kustas, W.P.; Alsina, M.M.; Sanchez, L.; Sams, B.; McKee, L.; Dulaney, W.; White, W.A.; et al. Daily mapping of 30 m LAI and NDVI for grape yield prediction in California vineyards. Remote Sens. 2017, 9, 317. [Google Scholar] [CrossRef] [Green Version]
  6. Fernández-González, M.; Escuredo, O.; Rodríguez-Rajo, F.J.; Aira, M.J.; Jato, V. Prediction of grape production by grapevine cultivar Godello in north-west Spain. J. Agric. Sci. 2011, 149, 725–736. [Google Scholar] [CrossRef]
  7. Nuske, S.; Gupta, K.; Narasimhan, S.; Singh, S. Modeling and calibrating visual yield estimates in vineyards. In Field and Service Robotics; Springer: Berlin/Heidelberg, Germany, 2014; pp. 343–356. [Google Scholar]
  8. Fernandez-Gonzalez, M.; Rodriguez-Rajo, F.J.; Jato, V.; Escuredo, O.; Aira, M.J. Estimation of yield ‘Loureira’ variety with an aerobiological and phenological model. Grana 2011, 50, 63–72. [Google Scholar] [CrossRef]
  9. De La Fuente, M.; Linares, R.; Baeza, P.; Miranda, C.; Lissarrague, J.R. Comparison of different methods of grapevine yield prediction in the time window between fruitset and veraison. OENO One 2015, 49, 27. [Google Scholar] [CrossRef]
  10. Tardaguila, J.; Diago, M.P.; Millan, B.; Blasco, J.; Cubero, S.; Aleixos, N. Applications of Computer Vision Techniques in Viticulture to Assess Canopy Features, Cluster Morphology and Berry Size. In I International Workshop on Vineyard Mechanization and Grape and Wine Quality; Poni, S., Ed.; Acta Horticulturae; International Society for Horticultural Science: Leuven, Belgium, 2013; Volume 978. [Google Scholar]
  11. Carrillo, E.; Matese, A.; Rousseau, J.; Tisseyre, B. Use of multi-spectral airborne imagery to improve yield sampling in viticulture. Precis. Agric. 2016, 17, 74–92. [Google Scholar] [CrossRef]
  12. Nuske, S.; Wilshusen, K.; Achar, S.; Yoder, L.; Narasimhan, S.; Singh, S. Automated Visual Yield Estimation in Vineyards. J. Field Robot. 2014, 31, 996. [Google Scholar] [CrossRef]
  13. Rose, J.C.; Kicherer, A.; Wieland, M.; Klingbeil, L.; Topfer, R.; Kuhlmann, H. Towards Automated Large-Scale 3D Phenotyping of Vineyards under Field Conditions. Sensors 2016, 16, 2136. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Whalley, J.; Shanmuganathan, S. Applications of image processing in viticulture: A review. In Proceedings of the MSSANZ-International Congress on Modelling and Simulation, Adelaide Convention Centre, Adelaide, Australia, 1–6 December 2013; pp. 531–537. [Google Scholar]
  15. Victorino, G.; Braga, R.; Santos-Victor, J.; Lopes, C. Yield components detection and image-based indicators for non-invasive grapevine yield prediction at different phenological phases. OENO One 2020, 25, 833. [Google Scholar] [CrossRef]
  16. Sishodia, R.; Ray, R.; Singh, S. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  17. Boursianis, A.D.; Papadopoulou, M.S.; Diamantoulakis, P.; Liopa-Tsakalidi, A.; Barouchas, P.; Salahas, G.; Karagiannidis, G.; Wan, S.; Goudos, S.K. Internet of Things (IoT) and Agricultural Unmanned Aerial Vehicles (UAVs) in smart farming: A comprehensive review. Internet Things 2020, 100187. [Google Scholar] [CrossRef]
  18. Hall, A.; Lamb, D.; Holzapfel, B.; Louis, J. Optical remote sensing applications in viticulture - A review. Aust. J. Grape Wine Res. 2002, 8, 36–47. [Google Scholar] [CrossRef]
  19. Linaza, M.T.; Posada, J.; Bund, J.; Eisert, P.; Quartulli, M.; Döllner, J.; Pagani, A.; Olaizola, G.I.; Barriguinha, A.; Moysiadis, T.; et al. Data-Driven Artificial Intelligence Applications for Sustainable Precision Agriculture. Agronomy 2021, 11, 1227. [Google Scholar] [CrossRef]
  20. Arnó, J.; Casasnovas, M.i.; Ribes-Dasi, M.; Rosell-Polo, J. Review. Precision Viticulture. Research topics, challenges and opportunities in site-specific vineyard management. Span. J. Agric. Res. 2009, 7. [Google Scholar] [CrossRef] [Green Version]
  21. Lopes, C.; Graça, J.; Sastre, J.; Reyes, M.; Guzman, R.; Braga, R.; Monteiro, A.; Pinto, P. Vineyard yield estimation by vinbot robot - preliminary results with the white variety viosinho. In Proceedings of the 11th International Terroir Congress, McMinnville, OR, USA, 10–14 July 2016; p. 516. [Google Scholar]
  22. Costa, R.; Fraga, H.; Malheiro, A.C.; Santos, J.A. Application of crop modelling to portuguese viticulture: Implementation and added-values for strategic planning. Cienc. Tec. Vitivinic. 2015, 30, 29–42. [Google Scholar] [CrossRef] [Green Version]
  23. Sirsat, M.; Moreira, J.; Ferreira, C.; Cunha, M. Machine Learning predictive model of grapevine yield based on agroclimatic patterns. Eng. Agric. Environ. Food 2019, 12. [Google Scholar] [CrossRef]
  24. Cunha, M.; Ribeiro, H.; Abreu, I. Pollen-based predictive modelling of wine production: Application to an arid region. Eur. J. Agron. 2015, 73, 42–54. [Google Scholar] [CrossRef] [Green Version]
  25. Padua, L.; Marques, P.; Adao, T.; Guimaraes, N.; Sousa, A.; Peres, E.; Sousa, J.J. Vineyard Variability Analysis through UAV-Based Vigour Maps to Assess Climate Change Impacts. Agronomy 2019, 9, 581. [Google Scholar] [CrossRef] [Green Version]
  26. Fraga, H.; Malheiro, A.; Moutinho Pereira, J.; Pinto, J.; Santos, J. Future scenarios for viticultural zoning in Europe: Ensemble projections and uncertainties. Int. J. Biometeorol. 2013, 2067. [Google Scholar] [CrossRef] [PubMed]
  27. Shanmuganathan, S. Viticultural Zoning for the Identification and Characterisation of New Zealand “Terroirs” Using Cartographic Data. In Proceedings of the GeoCart’2010 and ICA Symposium on Cartography, Auckland, New Zealand, 1–3 September 2010. [Google Scholar]
  28. Fraga, H.; García de Cortázar Atauri, I.; Malheiro, A.C.; Santos, J.A. Modelling climate change impacts on viticultural yield, phenology and stress conditions in Europe. Glob. Chang. Biol. 2016, 22, 3774–3788. [Google Scholar] [CrossRef] [PubMed]
  29. Santesteban, L.G.; Guillaume, S.; Royo, J.B.; Tisseyre, B. Are precision agriculture tools and methods relevant at the whole-vineyard scale? Precis. Agric. 2013, 14, 2–17. [Google Scholar] [CrossRef] [Green Version]
  30. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, n71. [Google Scholar] [CrossRef] [PubMed]
  31. Krzywinski, M.; Schein, J.; Birol, I.; Connors, J.; Gascoyne, R.; Horsman, D.; Jones, S.J.; Marra, M.A. Circos: An information aesthetic for comparative genomics. Genome Res. 2009, 19, 1639–1645. [Google Scholar] [CrossRef] [Green Version]
  32. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  33. Guilpart, N.; Metay, A.; Gary, C. Grapevine bud fertility and number of berries per bunch are determined by water and nitrogen stress around flowering in the previous year. Eur. J. Agron. 2014, 54, 9–20. [Google Scholar] [CrossRef]
  34. Ubalde, J.M.; Sort, X.; Poch, R.M.; Porta, M. Influence of edapho-climatic factors on grape quality in Conca de Barbera vineyards (Catalonia, Spain). J. Int. Sci. Vigne Vin 2007, 41, 33–41. [Google Scholar]
  35. Taylor, J.A.; Dresser, J.L.; Hickey, C.C.; Nuske, S.T.; Bates, T.R. Considerations on spatial crop load mapping. Aust. J. Grape Wine Res. 2019, 25, 144–155. [Google Scholar] [CrossRef]
  36. Bindi, M.; Fibbi, L.; Gozzini, B.; Orlandini, S.; Miglietta, F. Modelling the impact of future climate scenarios on yield and yield variability of grapevine. Clim. Res. 1996, 7, 213–224. [Google Scholar] [CrossRef]
  37. Tarara, J.M.; Ferguson, J.C.; Blom, P.E.; Pitts, M.J.; Pierce, F.J. Estimation of grapevine crop mass and yield via automated measurements of trellis tension. Trans. Am. Soc. Agric. Eng. 2004, 47, 647–657. [Google Scholar] [CrossRef] [Green Version]
  38. Parr, B.; Legg, M.; Alam, F.; Bradley, S. Acoustic Identification of Grape Clusters Occluded by Foliage. In Proceedings of the 2020 IEEE Sensors Applications Symposium (SAS), Kuala Lumpur, Malaysia, 9–11 March 2020; pp. 1–6. [Google Scholar]
  39. Altherwy, Y.N.; McCann, J.A. SING: Free Space SensING of Grape Moisture using RF Shadowing. IEEE Trans. Instrum. Meas. 2020, 70, 6001112. [Google Scholar] [CrossRef]
  40. Palacios, F.; Bueno, G.; Salido, J.; Diago, M.P.; Hernández, I.; Tardaguila, J. Automated grapevine flower detection and quantification method based on computer vision and deep learning from on-the-go imaging using a mobile sensing platform under field conditions. Comput. Electron. Agric. 2020, 178, 105796. [Google Scholar] [CrossRef]
  41. Diago, M.P.; Sanz-Garcia, A.; Millan, B.; Blasco, J.; Tardaguila, J. Assessment of flower number per inflorescence in grapevine by image analysis under field conditions. J. Sci. Food Agric. 2014, 94, 1981–1987. [Google Scholar] [CrossRef] [PubMed]
  42. Aquino, A.; Millan, B.; Gutierrez, S.; Tardaguila, J. Grapevine flower estimation by applying artificial vision techniques on images with uncontrolled scene and multi-model analysis. Comput. Electron. Agric. 2015, 119, 92–104. [Google Scholar] [CrossRef]
  43. Aquino, A.; Millan, B.; Gaston, D.; Diago, M.-P.; Tardaguila, J. vitisFlower®: Development and Testing of a Novel Android-Smartphone Application for Assessing the Number of Grapevine Flowers per Inflorescence Using Artificial Vision Techniques. Sensors 2015, 15, 21204–21218. [Google Scholar] [CrossRef]
  44. Liu, S.; Li, X.; Wu, H.; Xin, B.; Tang, J.; Petrie, P.R.; Whitty, M. A robust automated flower estimation system for grape vines. Biosyst. Eng. 2018, 172, 110–123. [Google Scholar] [CrossRef]
45. López-Miranda, S.; Yuste, J. Influence of flowers per cluster, fruit-set and berry weight on cluster weight in Verdejo grapevine (Vitis vinifera L.). J. Int. Sci. Vigne Vin 2004, 38, 41–47. [Google Scholar] [CrossRef]
  46. Millan, B.; Aquino, A.; Diago, M.P.; Tardaguila, J. Image analysis-based modelling for flower number estimation in grapevine. J. Sci. Food Agric. 2017, 97, 784–792. [Google Scholar] [CrossRef]
  47. Rudolph, R.; Herzog, K.; Töpfer, R.; Steinhage, V. Efficient identification, localization and quantification of grapevine inflorescences and flowers in unprepared field images using Fully Convolutional Networks. Vitis J. Grapevine Res. 2019, 58, 95–104. [Google Scholar] [CrossRef]
  48. Millan, B.; Velasco-Forero, S.; Aquino, A.; Tardaguila, J. On-the-go grapevine yield estimation using image analysis and boolean model. J. Sens. 2018, 2018, 9634752. [Google Scholar] [CrossRef]
  49. Liu, S.; Cossell, S.; Tang, J.; Dunn, G.; Whitty, M. A computer vision system for early stage grape yield estimation based on shoot detection. Comput. Electron. Agric. 2017, 137, 88–101. [Google Scholar] [CrossRef]
50. Demestihas, C.; Debuisson, S.; Descotes, A. Decomposing the notion of vine vigour with a proxy-detection shoot sensor: Physiocap®. In Proceedings of the E3S Web of Conferences, Polanica-Zdrój, Poland, 16–18 April 2018. [Google Scholar]
  51. Moreno, H.; Rueda-Ayala, V.; Ribeiro, A.; Bengochea-Guevara, J.; Lopez, J.; Peteinatos, G.; Valero, C.; Andújar, D. Evaluation of Vineyard Cropping Systems Using On-Board RGB-Depth Perception. Sensors 2020, 20, 6912. [Google Scholar] [CrossRef]
  52. Henry, D.; Aubert, H.; Veronese, T.; Serrano, E. Remote estimation of intra-parcel grape quantity from three-dimensional imagery technique using ground-based microwave FMCW radar. IEEE Instrum. Meas. Mag. 2017, 20, 20–24. [Google Scholar] [CrossRef]
  53. Henry, D.; Aubert, H.; Véronèse, T. Proximal Radar Sensors for Precision Viticulture. IEEE Trans. Geosci. Remote Sens. 2019, 57, 4624–4635. [Google Scholar] [CrossRef]
  54. Font, D.; Pallejà, T.; Tresanchez, M.; Teixidó, M.; Martinez, D.; Moreno, J.; Palacín, J. Counting red grapes in vineyards by detecting specular spherical reflection peaks in RGB images obtained at night with artificial illumination. Comput. Electron. Agric. 2014, 108, 105–111. [Google Scholar] [CrossRef]
  55. Fernandez, R.; Montes, H.; Salinas, C.; Sarria, J.; Armada, M. Combination of RGB and Multispectral Imagery for Discrimination of Cabernet Sauvignon Grapevine Elements. Sensors 2013, 13, 7838–7859. [Google Scholar] [CrossRef] [PubMed] [Green Version]
56. Silver, D.L.; Monga, T. In Vino Veritas: Estimating Vineyard Grape Yield from Images Using Deep Learning. In Proceedings of the 32nd Canadian Conference on Artificial Intelligence, Ontario, Canada, 28–31 May 2019; pp. 212–224. [Google Scholar]
  57. Liu, S.; Zeng, X.D.; Whitty, M. 3DBunch: A Novel iOS-Smartphone Application to Evaluate the Number of Grape Berries per Bunch Using Image Analysis Techniques. IEEE Access 2020, 8, 114663–114674. [Google Scholar] [CrossRef]
  58. Aquino, A.; Barrio, I.; Diago, M.P.; Milian, B.; Tardaguila, J. vitisBerry: An Android-smartphone application to early evaluate the number of grapevine berries by means of image analysis. Comput. Electron. Agric. 2018, 148, 19–28. [Google Scholar] [CrossRef]
  59. Santos, T.T.; de Souza, L.L.; dos Santos, A.A.; Avila, S. Grape detection, segmentation, and tracking using deep neural networks and three-dimensional association. Comput. Electron. Agric. 2020, 170. [Google Scholar] [CrossRef] [Green Version]
  60. Shanmuganathan, S.; Narayanan, A.; Robison, N. A cellular automaton framework for within-field vineyard variance and grape production simulation. In Proceedings of the 2011 Seventh International Conference on Natural Computation, Shanghai, China, 26–28 July 2011; pp. 1430–1435. [Google Scholar]
61. Xin, B.; Liu, S.; Whitty, M. Three-dimensional reconstruction of Vitis vinifera (L.) cvs Pinot Noir and Merlot grape bunch frameworks using a restricted reconstruction grammar based on the stochastic L-system. Aust. J. Grape Wine Res. 2020, 26, 207–219. [Google Scholar] [CrossRef]
  62. Tang, J.; Woods, M.; Cossell, S.; Liu, S.; Whitty, M. Non-Productive Vine Canopy Estimation through Proximal and Remote Sensing. IFAC PapersOnLine 2016, 49, 398–403. [Google Scholar] [CrossRef]
  63. Moreno, H.; Valero, C.; Bengochea-Guevara, J.M.; Ribeiro, A.; Garrido-Izard, M.; Andujar, D. On-Ground Vineyard Reconstruction Using a LiDAR-Based Automated System. Sensors 2020, 20, 1102. [Google Scholar] [CrossRef] [Green Version]
  64. Tagarakis, A.C.; Koundouras, S.; Fountas, S.; Gemtos, T. Evaluation of the use of LIDAR laser scanner to map pruning wood in vineyards and its potential for management zones delineation. Precis. Agric. 2018, 19, 334–347. [Google Scholar] [CrossRef]
  65. Matese, A.; Di Gennaro, S.F. Beyond the traditional NDVI index as a key factor to mainstream the use of UAV in precision viticulture. Sci. Rep. 2021, 11. [Google Scholar] [CrossRef] [PubMed]
  66. Stamatiadis, S.; Taskos, D.; Tsadila, E.; Christofides, C.; Tsadilas, C.; Schepers, J.S. Comparison of passive and active canopy sensors for the estimation of vine biomass production. Precis. Agric. 2010, 11, 306–315. [Google Scholar] [CrossRef] [Green Version]
  67. Di Gennaro, S.F.; Toscano, P.; Cinat, P.; Berton, A.; Matese, A. A precision viticulture UAV-based approach for early yield prediction in vineyard. In Proceedings of the Precision Agriculture 2019 - Papers Presented at the 12th European Conference on Precision Agriculture, ECPA 2019, Montpellier, France, 8–11 July 2019; pp. 373–379. [Google Scholar]
  68. Arnó, J.; Escola, A.; Valles, J.M.; Llorens, J.; Sanz, R.; Masip, J.; Palacin, J.; Rosell-Polo, J.R. Leaf area index estimation in vineyards using a ground-based LiDAR scanner. Precis. Agric. 2013, 14, 290–306. [Google Scholar] [CrossRef] [Green Version]
  69. Serrano, L.; González-Flor, C.; Gorchs, G. Assessment of grape yield and composition using the reflectance based Water Index in Mediterranean rainfed vineyards. Remote Sens. Environ. 2012, 118, 249–258. [Google Scholar] [CrossRef] [Green Version]
  70. Ellis, R.; Moltchanova, E.; Gerhard, D.; Trought, M.; Yang, L. Using Bayesian growth models to predict grape yield. OENO One 2020, 54, 443–453. [Google Scholar] [CrossRef]
  71. Cola, G.; Mariani, L.; Salinari, F.; Civardi, S.; Bernizzoni, F.; Gatti, M.; Poni, S. Description and testing of a weather-based model for predicting phenology, canopy development and source-sink balance in Vitis vinifera L. cv. Barbera. Agric. For. Meteorol. 2014, 184, 117–136. [Google Scholar] [CrossRef]
  72. Fernández-González, M.; Ribeiro, H.; Piña-Rey, A.; Abreu, I.; Rodríguez-Rajo, F.J. Integrating Phenological, Aerobiological and Weather Data to Study the Local and Regional Flowering Dynamics of Four Grapevine Cultivars. Agronomy 2020, 10, 185. [Google Scholar] [CrossRef] [Green Version]
  73. Valdes-Gomez, H.; Celette, F.; de Cortazar-Atauri, I.G.; Jara-Rojas, F.; Ortega-Farias, S.; Gary, C. Modelling soil water content and grapevine growth and development with the stics crop-soil model under two different water management strategies. J. Int. Sci. Vigne Vin 2009, 43, 13–28. [Google Scholar] [CrossRef] [Green Version]
  74. Fraga, H.; Costa, R.; Moutinho-Pereira, J.; Correia, C.M.; Dinis, L.T.; Gonçalves, I.; Silvestre, J.; Eiras-Dias, J.; Malheiro, A.C.; Santos, J.A. Modeling phenology, water status, and yield components of three Portuguese grapevines using the STICS crop model. Am. J. Enol. Vitic. 2015, 66, 482–491. [Google Scholar] [CrossRef]
  75. Rovira-Más, F.; Sáiz-Rubio, V. Crop Biometric Maps: The Key to Prediction. Sensors 2013, 13, 12698–12743. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  76. Tarara, J.M.; Chaves, B.; Sanchez, L.A.; Dokoozlian, N.K. Use of cordon wire tension for static and dynamic prediction of grapevine yield. Am. J. Enol. Vitic. 2014, 65, 443–452. [Google Scholar] [CrossRef] [Green Version]
  77. Blom, P.E.; Tarara, J.M. Trellis Tension Monitoring Improves Yield Estimation in Vineyards. HortScience 2009, 44, 678–685. [Google Scholar] [CrossRef]
  78. Tarara, J.M.; Chaves, B.; Sanchez, L.A.; Dokoozlian, N.K. Analytical determination of the lag phase in grapes by remote measurement of trellis tension. HortScience 2013, 48, 453–461. [Google Scholar] [CrossRef] [Green Version]
  79. Santos, J.A.; Ceglar, A.; Toreti, A.; Prodhomme, C. Performance of seasonal forecasts of Douro and Port wine production. Agric. For. Meteorol. 2020, 291. [Google Scholar] [CrossRef]
  80. Gouveia, C.; Liberato, M.L.R.; DaCamara, C.C.; Trigo, R.M.; Ramos, M.A. Modelling past and future wine production in the Portuguese Douro Valley. Clim. Res. 2011, 48, 349–362. [Google Scholar] [CrossRef]
  81. Fraga, H.; Santos, J.A. Daily prediction of seasonal grapevine production in the Douro wine region based on favourable meteorological conditions. Aust. J. Grape Wine Res. 2017, 23, 296–304. [Google Scholar] [CrossRef]
  82. Besselat, B. Les prévisions de récolte en viticulture. OENO One 1987, 21, 1. [Google Scholar] [CrossRef]
  83. Cristofolini, F.; Gottardini, E. Concentration of airborne pollen of Vitis vinifera L. and yield forecast: A case study at S. Michele all’Adige, Trento, Italy. Aerobiologia 2000, 16, 125–129. [Google Scholar] [CrossRef]
  84. González-Fernández, E.; Piña-Rey, A.; Fernández-González, M.; Aira, M.J.; Rodríguez-Rajo, F.J. Prediction of Grapevine Yield Based on Reproductive Variables and the Influence of Meteorological Conditions. Agronomy 2020, 10, 714. [Google Scholar] [CrossRef]
  85. Cunha, M.; Marcal, A.R.S.; Silva, L. Very early prediction of wine yield based on satellite data from VEGETATION. Int. J. Remote Sens. 2010, 31, 3125–3142. [Google Scholar] [CrossRef]
  86. Ballesteros, R.; Intrigliolo, D.S.; Ortega, J.F.; Ramírez-Cuesta, J.M.; Buesa, I.; Moreno, M.A. Vineyard yield estimation by combining remote sensing, computer vision and artificial neural network techniques. Precis. Agric. 2020. [Google Scholar] [CrossRef]
  87. Maimaitiyiming, M.; Sagan, V.; Sidike, P.; Kwasniewski, M.T. Dual Activation Function-Based Extreme Learning Machine (ELM) for Estimating Grapevine Berry Yield and Quality. Remote Sens. 2019, 11, 740. [Google Scholar] [CrossRef] [Green Version]
  88. Seng, K.P.; Ang, L.; Schmidtke, L.M.; Rogiers, S.Y. Computer Vision and Machine Learning for Viticulture Technology. IEEE Access 2018, 6, 67494–67510. [Google Scholar] [CrossRef]
  89. Liu, S.; Whitty, M. Automatic grape bunch detection in vineyards with an SVM classifier. J. Appl. Log. 2015, 13, 643–653. [Google Scholar] [CrossRef]
  90. Coviello, L.; Cristoforetti, M.; Jurman, G.; Furlanello, C. GBCNet: In-Field Grape Berries Counting for Yield Estimation by Dilated CNNs. Appl. Sci. 2020, 10, 4870. [Google Scholar] [CrossRef]
  91. Pérez-Zavala, R.; Torres-Torriti, M.; Cheein, F.A.; Troni, G. A pattern recognition strategy for visual grape bunch detection in vineyards. Comput. Electron. Agric. 2018, 151, 136–149. [Google Scholar] [CrossRef]
  92. Font, D.; Tresanchez, M.; Martinez, D.; Moreno, J.; Clotet, E.; Palacin, J. Vineyard Yield Estimation Based on the Analysis of High Resolution Images Obtained with Artificial Illumination at Night. Sensors 2015, 15, 8284–8301. [Google Scholar] [CrossRef] [Green Version]
  93. Rancon, F.; Bombrun, L.; Keresztes, B.; Germain, C. Comparison of SIFT Encoded and Deep Learning Features for the Classification and Detection of Esca Disease in Bordeaux Vineyards. Remote Sens. 2019, 11, 1. [Google Scholar] [CrossRef] [Green Version]
  94. Milella, A.; Marani, R.; Petitti, A.; Reina, G. In-field high throughput grapevine phenotyping with a consumer-grade depth camera. Comput. Electron. Agric. 2019, 156, 293–306. [Google Scholar] [CrossRef]
  95. Kicherer, A.; Herzog, K.; Pflanz, M.; Wieland, M.; Ruger, P.; Kecke, S.; Kuhlmann, H.; Topfer, R. An Automated Field Phenotyping Pipeline for Application in Grapevine Research. Sensors 2015, 15, 4823–4836. [Google Scholar] [CrossRef] [PubMed]
  96. Ivorra, E.; Sánchez, A.J.; Camarasa, J.G.; Diago, M.P.; Tardaguila, J. Assessment of grape cluster yield components based on 3D descriptors using stereo vision. Food Control 2015, 50, 273–282. [Google Scholar] [CrossRef] [Green Version]
  97. Aquino, A.; Millan, B.; Diago, M.-P.; Tardaguila, J. Automated early yield prediction in vineyards from on-the-go image acquisition. Comput. Electron. Agric. 2018, 144. [Google Scholar] [CrossRef]
  98. Diago, M.P.; Correa, C.; Millan, B.; Barreiro, P.; Valero, C.; Tardaguila, J. Grapevine Yield and Leaf Area Estimation Using Supervised Classification Methodology on RGB Images Taken under Field Conditions. Sensors 2012, 12, 16988–17006. [Google Scholar] [CrossRef] [Green Version]
  99. Íñiguez, R.; Palacios, F.; Barrio, I.; Hernández, I.; Gutiérrez, S.; Tardaguila, J. Impact of Leaf Occlusions on Yield Assessment by Computer Vision in Commercial Vineyards. Agronomy 2021, 11, 1003. [Google Scholar] [CrossRef]
  100. Mirbod, O.; Yoder, L.; Nuske, S. Automated Measurement of Berry Size in Images. IFAC PapersOnLine 2016, 49, 79–84. [Google Scholar] [CrossRef]
  101. Torres-Sánchez, J.; Mesas-Carrascosa, F.J.; Santesteban, L.-G.; Jiménez-Brenes, F.M.; Oneka, O.; Villa-Llop, A.; Loidi, M.; López-Granados, F. Grape Cluster Detection Using UAV Photogrammetric Point Clouds as a Low-Cost Tool for Yield Forecasting in Vineyards. Sensors 2021, 21, 3083. [Google Scholar] [CrossRef]
  102. Kurtser, P.; Ringdahl, O.; Rotstein, N.; Berenstein, R.; Edan, Y. In-Field Grape Cluster Size Assessment for Vine Yield Estimation Using a Mobile Robot and a Consumer Level RGB-D Camera. IEEE Robot. Autom. Lett. 2020, 5, 2031–2038. [Google Scholar] [CrossRef]
  103. Hacking, C.; Poona, N.; Manzan, N.; Poblete-Echeverria, C. Investigating 2-D and 3-D Proximal Remote Sensing Techniques for Vineyard Yield Estimation. Sensors 2019, 19, 3652. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  104. Marinello, F.; Pezzuolo, A.; Cillis, D.; Sartori, L. Kinect 3D reconstruction for quantification of grape bunches volume and mass. In Proceedings of the Engineering for Rural Development, Bucharest, Romania, 25–27 May 2016; pp. 876–881. [Google Scholar]
  105. Di Gennaro, S.F.; Toscano, P.; Cinat, P.; Berton, A.; Matese, A. A Low-Cost and Unsupervised Image Recognition Methodology for Yield Estimation in a Vineyard. Front. Plant Sci. 2019, 10. [Google Scholar] [CrossRef] [Green Version]
  106. Riggio, G.; Fantuzzi, C.; Secchi, C. A Low-Cost Navigation Strategy for Yield Estimation in Vineyards. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2200–2205. [Google Scholar]
  107. Aquino, A.; Diago, M.P.; Millán, B.; Tardáguila, J. A new methodology for estimating the grapevine-berry number per cluster using image analysis. Biosyst. Eng. 2017, 156, 80–95. [Google Scholar] [CrossRef]
  108. Nellithimaru, A.K.; Kantor, G.A. ROLS: Robust Object-Level SLAM for Grape Counting. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Long Beach, CA, USA, 16–17 June 2019; pp. 2648–2656. [Google Scholar]
  109. Schneider, T.; Paulus, G.; Anders, K.H. Towards predicting vine yield: Conceptualization of 3d grape models and derivation of reliable physical and morphological parameters. GI_Forum 2020, 8, 73–88. [Google Scholar] [CrossRef]
  110. Herrero-Huerta, M.; González-Aguilera, D.; Rodriguez-Gonzalvez, P.; Hernández-López, D. Vineyard yield estimation by automatic 3D bunch modelling in field conditions. Comput. Electron. Agric. 2015, 110, 17–26. [Google Scholar] [CrossRef]
  111. Hacking, C.; Poona, N.; Poblete-Echeverría, C. Vineyard yield estimation using 2-D proximal sensing: A multitemporal approach. OENO One 2020, 54, 793–812. [Google Scholar] [CrossRef]
  112. Liu, S.; Zeng, X.; Whitty, M. A vision-based robust grape berry counting algorithm for fast calibration-free bunch weight estimation in the field. Comput. Electron. Agric. 2020, 173, 105360. [Google Scholar] [CrossRef]
113. Nuske, S.; Achar, S.; Bates, T.; Narasimhan, S.; Singh, S. Yield Estimation in Vineyards by Visual Grape Detection. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011. [Google Scholar]
  114. Dunn, G.M.; Martin, S.R. Yield prediction from digital image analysis: A technique with potential for vineyard assessments prior to harvest. Aust. J. Grape Wine Res. 2004, 10, 196–198. [Google Scholar] [CrossRef]
  115. Schöler, F.; Steinhage, V. Automated 3D reconstruction of grape cluster architecture from sensor data for efficient phenotyping. Comput. Electron. Agric. 2015, 114, 163–177. [Google Scholar] [CrossRef]
  116. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef] [Green Version]
  117. Henrich, V.; Götze, C.; Jung, A.; Sandow, C.; Thürkow, D.; Cornelia, G. Development of an online indices database: Motivation, concept and implementation. In Proceedings of the 6th EARSeL Imaging Spectroscopy SIG Workshop Innovative Tool for Scientific and Commercial Environment Applications, Tel Aviv, Israel, 16–18 March 2009. [Google Scholar]
  118. Johnson, D.M.; Mueller, R. Pre- and within-season crop type classification trained with archival land cover information. Remote Sens. Environ. 2021, 264. [Google Scholar] [CrossRef]
  119. Defourny, P.; Bontemps, S.; Bellemans, N.; Cara, C.; Dedieu, G.; Guzzonato, E.; Hagolle, O.; Inglada, J.; Nicola, L.; Rabaute, T.; et al. Near real-time agriculture monitoring at national scale at parcel resolution: Performance assessment of the Sen2-Agri automated system in various cropping systems around the world. Remote Sens. Environ. 2019, 221, 551–568. [Google Scholar] [CrossRef]
  120. López-Andreu, F.J.; Erena, M.; Dominguez-Gómez, J.A.; López-Morales, J.A. Sentinel-2 images and machine learning as tool for monitoring of the common agricultural policy: Calasparra rice as a case study. Agronomy 2021, 11, 621. [Google Scholar] [CrossRef]
  121. Janowicz, K.; Gao, S.; McKenzie, G.; Hu, Y.; Bhaduri, B. GeoAI: Spatially Explicit Artificial Intelligence Techniques for Geographic Knowledge Discovery and Beyond. Int. J. Geogr. Inf. Sci. 2020, 34, 1–13. [Google Scholar] [CrossRef]
Figure 1. Systematic review procedure for article selection.
Figure 2. Distribution according to scientific peer-reviewed journals (top 6 highlighted).
Figure 3. Representation of the included records, by research methodology and year of publication.
Table 1. List of abbreviations and acronyms.
| Abbreviation/Acronym | Meaning |
| --- | --- |
| AI | Artificial Intelligence |
| BDA | Big Data Analytics |
| BN | Berry number |
| CEC | Cation Exchange Capacity |
| CIVC | Comité Interprofessionel du Vin de Champagne |
| DSS | Decision Support System |
| FMCW | Frequency-Modulated Continuous-Wave |
| GDD | Growing Degree Day |
| GNDVI | Green Normalized Difference Vegetation Index |
| GT | Geo-spatial Technologies |
| IoT | Internet of Things |
| LA | Leaf Area |
| LAI | Leaf Area Index |
| LiDAR | Light Detection And Ranging |
| MLR | Multiple Linear Regression |
| MTVI | Modified Triangular Vegetation Index |
| NDVI | Normalized Difference Vegetation Index |
| PA | Precision Agriculture |
| PLSR | Partial Least Squares Regression |
| PRISMA | Preferred Reporting Items for Systematic Reviews and Meta-Analyses |
| PS | Proximal Sensing |
| PV | Precision Viticulture |
| ReLU | Rectified Linear Unit |
| RF | Radio Frequency |
| RFR | Random Forest Regression |
| RGB | Red Green Blue |
| RGB-D | Red Green Blue-Depth |
| RPI | Regional Pollen Index |
| RS | Remote Sensing |
| SAR | Synthetic Aperture Radar |
| SPOT | Satellite Pour l'Observation de la Terre |
| STICS | Simulateur mulTIdisciplinaire pour les Cultures Standard |
| Tanh | Hyperbolic Tangent function |
| TTM | Trellis Tension Monitor |
| UAV | Unmanned Aerial Vehicle |
| UGV | Unmanned Ground Vehicle |
| UTV | Utility Terrain Vehicle |
| WI | Water Index |
| WRELM | Weighted Regularized Extreme Learning Machine |
| WRELM-TanhRe | TanhRe-based Weighted Regularized Extreme Learning Machine |
Table 2. Summary of records included in the systematic review (data-driven models based on computer vision and image processing).
| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
| --- | --- | --- | --- | --- | --- |
| [10] | Digital still RGB camera | Field/Laboratory | local | Cluster weight, berry number per cluster, berry size, berry weight | 0.76 < R² < 0.96 (for all variables) |
| [4] | Digital still RGB camera | Laboratory-based | local | Berry number, berry weight, bunch weight | 0.65 < R² < 0.97 (for cluster weight) |
| [96] | Bumblebee2 stereo camera | Laboratory-based | local | Cluster volume and compactness; berry number, size, and weight | 0.71 < R² < 0.82 (for all variables) |
| [97] | All-terrain vehicle (ATV) + RGB camera | In-field | local | Berry number | 0.74 < R² < 0.78 RMSE (for yield) |
| [40] | All-terrain vehicle (ATV) + RGB camera | In-field | local | Number of flowers | R² > 0.70 (for yield) |
| [12] | All-terrain vehicle (ATV) + RGB camera | In-field | local | Berry detection, number of berries, cluster area, cluster weight | 0.41 < R² < 0.75 (for yield) |
| [49] | Video with commercial camera (GoPro) | In-field | local | Number of grapevine shoots | 86.83% (for shoot detection) and 1.18% < error < 36.02% (for yield) |
| [51] | RGB-D camera (Microsoft Kinect V2 sensor) | In-field | local | Branch volume | R² = 0.87 (for yield) |
| [98] | Digital still RGB camera | In-field | local | Leaf area | R² = 0.73 (for yield) |
| [46] | Handheld RGB camera | In-field | local | Number of flowers, berry weight | 0.49 < R² < 0.91 (for yield) |
| [99] | RGB camera | In-field | local | Leaf occlusion, yield, bunch number and bunch weight | 0.42 < R² < 0.87 (for yield) |
| [56] | Smartphone camera | In-field | local | Bunch area | 0.51 < R² < 0.54 (for yield) |
| [57] | Smartphone RGB camera | In-field | local | Number of berries per bunch | 91% (for berries per bunch) |
| [41] | Digital still RGB camera | Laboratory-based | local | Number of inflorescences | R² > 0.80 (for number of inflorescences) |
| [100] | All-terrain vehicle (ATV) + stereo camera | In-field | local | Berry size, volume, and weight | 0.76 < R² < 0.96 (for berry weight) |
| [89] | Digital still RGB camera | In-field | local | Bunch area | 87 to 90% (for bunch detection) |
| [55] | RGB + multispectral camera | In-field | local | % of leaves, stems, branches, fruits and background | Precision: 89.7% (fruits), 57.2% (stems), 87.6% (leaves), 5.4% (branches) |
| [54] | RGB camera at night under artificial lighting | In-field | local | Berry number | Average error = −14% (for number of berries) |
| [1] | Phenoliner field phenotyping platform | In-field | local | Berry number | 87% < berry identification < 94% |
| [47] | Single-lens reflex (SLR) camera | In-field | local | Number of inflorescences and single flowers | Precision < 70.7% (for flower extraction) |
| [101] | UAV + RGB camera | In-field | local | Grape cluster area | 0.75 < R² < 0.82 (for harvest weight) |
| [90] | Smartphone camera | In-field | local | Berry number | Average test error < 5% (for berry number) |
| [59] | Digital still RGB camera | In-field | local | Grape detection | F1-score < 0.91 (for instance segmentation) |
| [42] | Handheld RGB camera | In-field | local | Number of flowers | 0.86 < R² < 0.99 (for number of flowers) |
| [102] | RGB-D camera mounted on a mobile robotic platform | In-field | local | Cluster size | 2.8–3.5 cm average error (for cluster size) |
| [94] | Intel RealSense RGB-D R200 imaging system | In-field | local | Canopy volume, bunch detection and counting | Maximum accuracy of 91.52% (for detected fruits) |
| [103] | 2-D RGB and 3-D RGB-D (Kinect sensor) | Field/Laboratory | local | Bunch area and volume | R² = 0.89 (yield with RGB); R² = 0.95 (yield with RGB-D) |
| [104] | Microsoft Kinect™ RGB-depth | Laboratory-based | local | Bunch volume through 3D bunch reconstruction | 10% < error < 15% (for bunch volume) |
| [105] | High-resolution RGB images (20 MP) taken with a UAV | In-field | local | Cluster number and size | R² = 0.82 (for yield) |
| [106] | Robot with SICK S300 Expert laser scanner + GoPro Hero 4 | Field/Laboratory | local | Berry number | 0.55 < R² < 0.62 (for yield) |
| [7] | RGB camera + stereo camera mounted on UTV | In-field | local | Grape number | 3% < error < 4% (for yield) |
| [107] | Smartphone (BQ Aquaris E5) RGB camera | In-field | local | Berry number per cluster and cluster weight | 0.75 < R² < 0.83 (for berry number per cluster) |
| [62] | Multispectral aerial image + RGB camera (GoPro) | In-field | local | Non-productive vine canopy | 0.77 < precision (row) < 0.97 (for non-productive canopy) |
| [48] | Cluster images, manually acquired vine images, and vine images captured on-the-go using a quad | In-field | local | Number of berries in cluster images | 0.50 < R² < 0.87 (for yield) |
| [91] | RGB images | Field/Laboratory | local | Grape berry recognition and grape bunch detection | Grape bunch detection = 88.61%; single berries > 99% |
| [44] | RGB camera | Field/Laboratory | local | Number of flowers | Accuracy of 84.3% (for flower estimation) |
| [108] | Stereo camera | In-field | local | Dense 3D model of a vineyard and grape counting | R² = 0.99 (for grape count) |
| [61] | 2D images of grape bunches | Laboratory-based | local | Three-dimensional grape bunch reconstruction | −0.4% < average percentage error < 41.1% (for overall rachis reconstruction performance) |
| [13] | Track-driven vehicle consisting of a camera system and a real-time-kinematic GPS system (PHENObot) | In-field | local | Quantity of grape bunches, berries, and berry diameter | Average precision: 97.8% (berry yield) |
| [109] | Multi-view image datasets of grapes using close-range photogrammetry | Laboratory-based | local | Physical and morphological parameters from 3D grape models | Close-range photogrammetry can generate 3D grape models; parameters such as grape volume can be derived from these digital models |
| [92] | RGB high-resolution images obtained with artificial illumination at night | In-field | local | Grape-cluster image analysis parameters (area and volume) | Error = 16% (for grape cluster area), −16.7% (for grape cluster volume), −0.3% (average) |
| [110] | RGB images | In-field | local | 3D grapevine point cloud; volume, mass and number of berries per bunch | R² = 0.75 (for bunch weight) |
| [111] | RGB camera | Field/Laboratory | local | Bunch volume | 0.70 < R² < 0.91 (for yield) |
| [112] | RGB images with smartphone camera | In-field | local | 3D bunch reconstruction based on a single image | 0.82 < R² < 0.95 (for berry number); 0.85 < R² < 0.92 (for bunch weight) |
| [58] | RGB images with smartphone camera app (vitisBerry) | Laboratory-based | local | Berry counting on cluster images | Recall = 0.8762–0.9082; precision = 0.9392–0.9508 |
| [43] | RGB images with smartphone camera app (vitisFlower) | Laboratory-based | local | Number of grapevine flowers per inflorescence | 84% of flowers in the captures were found, with precision exceeding 94% |
| [15] | Robot with RGB-D Kinect v2 camera and RGB camera | Field/Laboratory | local | Number of spurs, shoots, inflorescences, bunches, berries; bunch volume, max length, and perimeter | 0.29 < R² < 0.99 (between bunch weight and other bunch attributes) |
| [113] | Sideways-facing camera and lighting on UTV | In-field | local | Detect and count grape berries | Predicts yield of individual vineyard rows to within 9.8% of actual crop weight |
| [114] | RGB images | In-field | local | Automatic count of "fruit" pixels and total number of pixels for each image | 0.85 < R² < 0.99 (for fruit pixels/total image pixels vs. fruit weight) |
| [67] | High-resolution RGB images acquired through an unmanned aerial vehicle (UAV) | In-field | local | Number of clusters and size | High accuracy in yield |
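The pipelines in Table 2 generally reduce to detecting fruit regions in an image and counting them. The following is a minimal, hypothetical sketch of that counting step only — a fixed brightness threshold followed by 4-connected component labeling on a tiny synthetic "image"; it does not reproduce any specific system cited above, which rely on trained detectors and occlusion calibration rather than a simple threshold.

```python
from collections import deque

def count_blobs(img, thresh=128):
    """Count 4-connected bright regions ('berries') in a grayscale image,
    given as a list of rows of pixel intensities (0-255)."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] >= thresh and not seen[y][x]:
                blobs += 1                      # new region found
                q = deque([(y, x)])
                seen[y][x] = True
                while q:                        # flood-fill the region
                    cy, cx = q.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] >= thresh and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return blobs

# Synthetic 5x6 frame: two bright blobs on a dark background.
frame = [
    [0,   0, 200, 210,   0, 0],
    [0,   0, 220,   0,   0, 0],
    [0,   0,   0,   0,   0, 0],
    [0, 180, 190,   0,   0, 0],
    [0, 175,   0,   0,   0, 0],
]
print(count_blobs(frame))  # → 2
```

In the cited systems this step is preceded by segmentation (SVMs, CNNs) and followed by a regression from counts or areas to bunch weight, which is where the R² values in the table come from.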
Table 3. Summary of records included in the systematic review (data-driven models based on vegetation indices).
| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
| --- | --- | --- | --- | --- | --- |
| [87] | Vegetation indices derived from canopy spectra | In-field | local | Vegetation indices derived from canopy spectra | 0.52 < R² < 0.68 (for berry yield and quality parameters) |
| [80] | Corine Land Cover map, wine statistics, monthly means of climate variables and NDVI | Simulated | Regional | Tmax, Tmin, Tavg, precipitation, NDVI | 0.62 < R < 0.90 (for wine production) |
| [65] | UAV multispectral camera | In-field | local | NDVI, canopy geometry-based indices | R² < 0.85 (for yield) |
| [5] | Satellite-based NDVI and LAI | In-field | Regional | NDVI, LAI | 0.66 < R < 0.83 (for NDVI vs. yield) and 0.66 < R < 0.83 (for LAI vs. yield) |
| [11] | Multispectral airborne imagery | In-field | local | NDVI, berry weight at harvest, bunch number per vine, and berry number per bunch | −0.04 < r < 0.81 (for NDVI vs. yield) |
| [85] | Satellite data from VEGETATION (NDVI from SPOT) | In-field | Regional | NDVI | 0.73 < R² < 0.84 (for yield) |
| [86] | Quadcopter md4-1000 with a multispectral Sequoia camera | In-field | local | Radiometry and geometry-based parameters (NDVI and Fc), water regime, fertilization, climate data | 0.60 < R² < 0.96 (for yield) |
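Most of the models in Table 3 regress yield on a vegetation index such as NDVI, computed per block or pixel from near-infrared and red reflectance. The sketch below uses invented reflectance and yield values and a plain ordinary-least-squares line; it illustrates the form of these models, not the fitted coefficients of any cited study.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical per-block (NIR, red) mean reflectances and observed yields (t/ha).
blocks = [(0.52, 0.08, 7.1), (0.48, 0.10, 6.2), (0.55, 0.07, 7.9), (0.44, 0.12, 5.4)]
x = [ndvi(nir, red) for nir, red, _ in blocks]
y = [t for _, _, t in blocks]
a, b = fit_line(x, y)
print(f"yield ≈ {a:.1f} * NDVI + {b:.1f}")
```

The regional studies in the table replace the per-block means with satellite composites (SPOT/VEGETATION) and often use correlation (R, r) rather than R² when only aggregate production statistics are available.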
Table 4. Summary of records included in the systematic review (data-driven models based on pollen).
| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
| --- | --- | --- | --- | --- | --- |
| [83] | Hirst-type volumetric spore trap (Lanzoni VPPS-2000) | In-field | Regional | Airborne pollen concentration | R² = 0.92 (for grape production) |
| [8] | Aerobiological data (Lanzoni VPPS-2000 volumetric trap) | In-field | Regional | Meteorological and phytopathological variables | R² = 0.98 (for yield) |
| [72] | Hirst volumetric pollen sampler and Cour passive trap | In-field | Regional | Airborne pollen concentration, weather data | R² = 0.96 (Cour); R² = 0.99 (Hirst) |
| [82] | Pollen concentration data | In-field | Regional | Airborne pollen concentration | R² < 0.98 (for yield) |
| [24] | Airborne pollen trap | Simulated | Regional | Airborne pollen concentration | 0.71 < R² < 0.86 (for annual wine production) |
| [6] | Aerobiological (Lanzoni VPPS-2000 volumetric trap), phenological (BBCH standardized scale), and meteorological data | In-field | local | Meteorological, phenological and phytopathological variables | 0.79 < R² < 0.99 (for yield) |
| [84] | Aerobiological data (Lanzoni VPPS-2000 volumetric sampler), meteorological data | In-field | Regional | Airborne pollen concentration and meteorological data | R² = 0.99 (for yield) |
Table 5. Summary of records included in the systematic review (D—crop simulation models).

| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [71] | Weather data and plant characteristics | Simulated/In-field validation | Regional | Weather data and plant characteristics | R2 = 0.96 (for yield in low-density canopies); R2 = 0.94 (for yield in high-density canopies) |
| [74] | Climate, soil, and management practices | Simulated/In-field validation | Regional | Climate data, soil and terrain parameters, water stress indices, management practices | R2 = 0.86 (for yield) |
| [73] | Phenology and harvest date, soil water content, water stress, and grapevine growth and yield | Simulated/In-field validation | Regional | Phenology and harvest date, soil water content, water stress, and grapevine growth and yield | R2 = 0.85 (for yield) |
| [23] | Weather, yield, phenological dates, fertilizer information, soil analysis, and maturation index data | Simulated/In-field validation | Regional | Weather, phenological dates, fertilizer information, soil analysis, and maturation index data | 24.2% < RRMSE < 28.6% |
Table 6. Summary of records included in the systematic review (E—data-driven models based on trellis tension).

| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [37] | Load cells installed in-line with the cordon wire | In-field | Local | Tension in the horizontal (cordon) support wire of the trellis | R2 = 0.99 (for tension and yield) |
| [78] | Tension sensor in main load-bearing wire | In-field | Local | Timing for hand sampling | n.d. |
| [76] | Trellis Tension Monitors (TTMs) | In-field | Local | Tension in the horizontal (cordon) support wire of the trellis | 0.81 < R2 < 0.98 (for yield) |

n.d. means no data.
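The trellis-tension principle is that fruit mass hanging on the cordon wire increases wire tension roughly linearly, so a site-specific calibration slope lets tension readings be converted into a crop-load estimate. The helper below is a hypothetical sketch of that idea; the function name, baseline, and calibration value are illustrative assumptions, not taken from [37], [76], or [78].

```python
# Hypothetical sketch of a trellis-tension yield estimate:
# crop load ~= (current tension - fruit-set baseline) * calibration slope.
def yield_from_tension(tension_now_n: float,
                       tension_baseline_n: float,
                       kg_per_newton: float) -> float:
    """Estimate crop load (kg) on a wire segment from the tension increase.

    tension_now_n      -- current wire tension reading (N)
    tension_baseline_n -- tension recorded before fruit load (N)
    kg_per_newton      -- site-specific calibration slope (kg of fruit per N)
    """
    delta = tension_now_n - tension_baseline_n
    return max(0.0, delta * kg_per_newton)  # clamp: no negative crop load

# Example: 180 N now vs. 120 N at fruit set, calibrated at 0.25 kg/N
print(yield_from_tension(180.0, 120.0, 0.25))  # 15.0 kg
```

In practice the reviewed systems log tension continuously, so the same linear mapping can also track crop-load development over the season rather than give a single point estimate.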
Table 7. Summary of records included in the systematic review (F—data-driven models based on laser data processing).

| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [50] | Physiocap® optical laser | In-field | Local | Shoot biomass | R2 = 0.98 (for yield) |
Table 8. Summary of records included in the systematic review (G—data-driven models based on radar data processing).

| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [53] | 3-D radar imagery (FM-CW radar) | In-field | Local | Polarization and magnitude of radar echoes | 0.79 < R2 < 0.97 (for yield) |
| [52] | 24 GHz frequency-modulated continuous-wave (FMCW) radar | In-field | Local | Grapes in grapevines from the radar echo distribution in the interrogated 3D scene | R2 = 0.947 (for grape volume) |
Table 9. Summary of records included in the systematic review (H—data-driven models based on radio frequency data processing).

| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [39] | RF signals | Laboratory-based | Local | Grape moisture content | Degree of accuracy = 90% (for moisture content) |
Table 10. Summary of records included in the systematic review (I—data-driven models based on ultrasonic signal processing).

| Reference | Data Sources | Test Environment | Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [38] | Ultrasonic array | Laboratory-based | Local | Grape cluster detection | Ability to propagate through foliage and reflect off grapes behind it |
Table 11. Summary of records included in the systematic review (B—other data-driven models).

| Reference | Data Sources | Test Environment | Applicability Scale | Related Variables | Estimation |
|---|---|---|---|---|---|
| [81] | Daily historic meteorological conditions, yield data | In-field | Regional | Temperature and precipitation | 0.68 ≤ r ≤ 0.84 (for grapevine production) |
| [34] | Edapho-climatic data | In-field | Local | Cation exchange capacity (CEC), Winkler index | R2 = 0.88 (CEC and Winkler index for yield) |
| [45] | Number of flowers per cluster, fruit-set percentage, berry weight | In-field | Local | Number of flowers per cluster, fruit-set percentage, berry weight | 0.54 < R2 < 0.93 (for number of flowers and yield) |
| [79] | Monthly mean air temperatures and monthly total precipitation data | Simulated | Regional | Monthly mean air temperatures and monthly total precipitation | Wine production classes (1—low, 2—normal, 3—high): average estimation ratio of 79% (calibration), 67% (validation) |
| [70] | Bunch mass data | In-field | Local | Bunch mass data | n.d. |
| [2] | Leaf area, berry number, yield, water potential in summer, berry weight, sugar concentration | In-field | Local | Leaf area, berry number, yield, water potential in summer, berry weight, sugar concentration | Leaf area/berry number (LA/BN) ratio properly estimated berry weight (R2 = 0.91) |

n.d. means no data.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
