Review

Advanced Technologies and Artificial Intelligence in Agriculture

by
Alexander Uzhinskiy
Joint Institute for Nuclear Research, 6 Joliot-Curie, Dubna 141980, Russia
AppliedMath 2023, 3(4), 799-813; https://doi.org/10.3390/appliedmath3040043
Submission received: 6 September 2023 / Revised: 24 October 2023 / Accepted: 25 October 2023 / Published: 1 November 2023
(This article belongs to the Special Issue Application of Machine Learning and Deep Learning Methods in Science)

Abstract
According to the Food and Agriculture Organization, the world’s food production needs to increase by 70 percent by 2050 to feed the growing population. However, the EU agricultural workforce has declined by 35% over the last decade, and 54% of agriculture companies have cited a shortage of staff as their main challenge. These factors, among others, have led to an increased interest in advanced technologies in agriculture, such as IoT, sensors, robots, unmanned aerial vehicles (UAVs), digitalization, and artificial intelligence (AI). Artificial intelligence and machine learning have proven valuable for many agriculture tasks, including problem detection, crop health monitoring, yield prediction, price forecasting, yield mapping, and pesticide and fertilizer usage optimization. In this scoping mini review, scientific achievements regarding the main directions of agricultural technologies will be explored. Successful commercial companies, both in the Russian and international markets, that have effectively applied these technologies will be highlighted. Additionally, a concise overview of various AI approaches will be presented, and our firsthand experience in this field will be shared.

Graphical Abstract

1. Introduction

The Earth’s population is growing rapidly, although not as fast as in the 20th century. Still, by 2050, the population is projected to exceed 9 billion. According to the Food and Agriculture Organization, global food production needs to increase by 70 percent by 2050. Meanwhile, over the past few decades, the European agricultural workforce has declined by 35%. More than 54% of agricultural companies have cited a shortage of staff as their main challenge in the coming years. These factors have elevated the acceptance and necessity of advanced technologies in agriculture to a new level. Just as tractors and harvesters revolutionized agriculture in the 20th century, UAVs, robots, and artificial intelligence are currently transforming the industry. Intriguing applications of artificial intelligence have been found in various areas related to agriculture, such as biology, genetics, chemistry, and animal husbandry. However, this paper focuses specifically on farming and plant cultivation.
There are many interesting reviews related to specific technologies or combinations of them. Xu et al. published a review [1] on agricultural IoT technology. Rejeb et al. [2] conducted a bibliometric analysis of publications related to drone applications in agriculture. Ojo et al. [3] focused on deep learning methods in controlled environment agriculture. Mail et al. [4] presented a review of agricultural harvesting robot technologies. Wang et al. [5] reviewed the applications of machine vision in agricultural robot navigation. Dayioglu and Turker [6] and Abbasi et al. [7] have published extensive reviews on Agriculture 4.0, which combines the most advanced modern technologies. An excellent review on Artificial Intelligence in Agriculture was conducted by Oliveira et al. [8]. The motivation behind this mini review is to explore the main advanced technologies in agriculture and AI tasks related to them. The aim is to showcase not only scientific achievements but also successful commercial companies in both the Russian and international markets that have effectively applied these technologies. Practitioners and researchers can use this review to find areas for applying their talents or to discover technological and business solutions that elevate agriculture to a new level.
Let us briefly explore the main technologies in agriculture, progressing from space down to the ground. Satellite imagery enables soil classification [9,10,11,12,13,14,15,16,17,18,19,20,21], field monitoring [22,23,24], planning assistance [25], and yield prediction [26,27,28,29]. Open programs are mostly limited in both spatial and temporal resolution. At the same time, commercial missions offer imagery with a resolution of less than 50 cm and the ability to obtain images of a specific area on demand. Depending on the analyzed area, UAVs can offer similar capabilities to satellite imagery, but with higher accuracy and flexibility. They can perform tasks such as soil analysis [30,31,32], seedling density tracking [33,34], weed and pest detection and classification [35,36,37,38], yield prediction [39,40,41], and harvest readiness assessment. In some rare cases, UAVs can be used for harvesting, precision fertilization [42,43,44], pesticide injections [45,46,47], or even mechanical pest destruction. The Internet of Things and sensors provide farmers with real-time information on soil parameters, temperature, gases in the air, weather conditions, and many other parameters, which are often passed to cloud infrastructures and can then be used for analysis and prediction [48,49,50,51]. Autonomous platforms (robots, unmanned ground vehicles (UGVs)) and tractors can execute warehouse operations, as well as carry out common farming operations such as planting [52], fertilization [53], pruning [54], weeding [55], hilling, and crop condition monitoring [56,57,58]. When it comes to greenhouse farming, navigation becomes challenging, limiting the application of UAVs and setting specific requirements for autonomous platforms, such as moving on different types of surfaces, working in different light conditions, and processing high-resolution imagery [59,60,61,62].
Controlled environment farming has experienced significant advancements; however, it primarily serves markets with high demand for fresh products and low electricity costs, such as the UAE region. On such farms, autonomous platforms and UAVs are not as popular due to the limited working space, but a number of operations can still be automated. Autonomous growing in a completely controlled environment appears achievable within the coming decades [3,63,64]. Digital platforms integrate information from various sources, equipping farmers with convenient tools for planning, growing, control, prediction, and even product placements and distribution [65,66,67,68].
The Meshcheryakov Laboratory of Information Technologies [69] at the Joint Institute for Nuclear Research [70] focuses on various areas of AI application in agriculture, mainly in the field of classification and detection tasks. Unfortunately, the details of some executed projects are subject to non-disclosure agreements (NDA). Therefore, this paper provides an excellent opportunity to publish general information about our research. Our notable achievement is the development of a plant disease detection platform along with its accompanying mobile application called DoctorP [71]. This platform is capable of classifying more than 70 different diseases and incorporates 26 specialized models designed for specific crops. Our approach utilizes one-shot learning, particularly employing Siamese neural networks with the triplet loss function, in addition to other advanced training techniques. The overall accuracy of our general model and specialized models reaches an impressive 97% [72]. Furthermore, we are actively involved in the detection of various diseases on potatoes. In this regard, we employ RGB and hyperspectral images, as well as utilize various YOLO-family networks, to tackle the detection task. Our aim is to propose practical solutions that can be implemented in the field, providing valuable insights and aiding in decision-making processes. In addition, we have conducted studies on the impact of light on crop evolution during different phenophases. Our interests also encompass utilizing computer vision and neural networks for monitoring tasks in greenhouse environments.
The remainder of the paper is organized as follows: Section 2 introduces advanced technologies in agriculture, namely Section 2.1 Application of satellite imagery, Section 2.2 Application of unmanned aerial vehicles, Section 2.3 Application of autonomous platforms and tractors, Section 2.4 Advances in controlled environment farming, and Section 2.5 General information about sensors and digitalization. Section 3 provides an overview of the MLIT activities related to artificial intelligence in agriculture. Finally, the conclusion is presented in Section 4.

2. Advanced Technologies in Agriculture

2.1. Satellite Imagery

NASA [73] reports that, as of 30 April 2022, 5465 active artificial satellites are orbiting the Earth. Figure 1A displays an artist’s depiction of Earth-orbiting satellites. Some notable missions, such as Landsat [74], Sentinel [75], and MODIS [76], provide their data for free. The resolution of imagery from these programs starts at 15 m, and the spectral range of the images is quite wide. However, the orbital cycle is relatively long, usually taking up to 16–18 days, and no one can guarantee a cloudless sky. Nevertheless, this data can be utilized to obtain various interesting information about territories.
Soil mapping and classification can be accomplished using this data via classical machine learning methods (support vector machine [9], random forest [10], gradient boosting [11], decision tree [12], cubist algorithm [13], etc.) or sophisticated neural algorithms (artificial neural network [14], recurrent neural network [15], deep recurrent neural network [16], deep neural network [17], deep convolutional neural network [18,19,20,21], etc.). Effective land utilization strategies can be created by analyzing not only fresh information, but also historical data. Common users have the option to perform some analyses themselves using Google Earth Engine [78], a cloud-based geospatial analysis platform; however, it requires experience and some IT skills. Alternatively, using specialized services [79] may be a more convenient approach.
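All of the cited approaches share a common shape: each satellite pixel becomes a feature vector of spectral band values, and a supervised model maps it to a soil or land-cover class. The following sketch illustrates that pixel-wise pipeline with entirely synthetic data and a toy nearest-centroid classifier standing in for the random forests and CNNs cited above:

```python
import numpy as np

def train_centroids(X, y):
    """Compute one mean spectral signature (centroid) per class label."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def classify_pixels(X, classes, centroids):
    """Assign each pixel to the class with the nearest spectral centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Synthetic example: 4 spectral bands, two soil classes with distinct signatures.
rng = np.random.default_rng(0)
clay = rng.normal([0.2, 0.3, 0.4, 0.5], 0.02, size=(50, 4))
sand = rng.normal([0.6, 0.5, 0.4, 0.3], 0.02, size=(50, 4))
X = np.vstack([clay, sand])
y = np.array([0] * 50 + [1] * 50)

classes, centroids = train_centroids(X, y)
pred = classify_pixels(X, classes, centroids)
print((pred == y).mean())  # near-perfect on well-separated synthetic classes
```

The structure (pixel matrix in, label vector out) is what carries over to the real methods; only the model in the middle changes.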
The OneSoil project [80] serves as an excellent example of a free ready-to-use project where satellite imagery, artificial intelligence, and digitalization serve farmers. The interface of the OneSoil Yield mobile app is presented in Figure 2. Pasture.io [81] and GeoPard Agriculture [82] are examples of commercial services that utilize satellite data for precision agriculture. The resolution of the methods is sufficient for general analysis, control, planning assistance, and yield prediction. However, for more accurate data, commercial satellite products are necessary.
Missions such as Maxar [83], Airbus [84] or Planet.com [77] provide high-resolution imagery with tens of spectral channels. One can request images of a specific area at an appropriate time. The Russian project known as Aerospace-agro [85] offers various tools for precision farming, allowing enterprises to increase their yield and revenue. AgroScout [86] is an Israeli company that provides agronomy services based on various sources of information, including Airbus high-resolution satellite data. The resolution of commercial products can be 0.5 m or even less. Despite this, it may not always be enough for an accurate diagnosis of many problems. Additionally, the cost of the data is relatively high. Nonetheless, the ability to analyze vast territories and acquire data on demand makes satellite imagery highly attractive for agronomy purposes.

2.2. Unmanned Aerial Vehicles

UAVs are incredibly useful for obtaining detailed information about fields. Equipped with multispectral and high-resolution cameras, they excel in tasks such as object detection, classification, and segmentation. Their resolution capabilities allow farmers to effectively monitor seedling density [33,34] and weed and pest presence [35,36,37,38]. Multispectral cameras enable the capture of NDVI (Normalized Difference Vegetation Index) and other essential indices to assess soil and plant conditions [87]. Specialized UAVs can handle precision fertilization [42,43,44], pesticide application [45,46,47], harvesting, and even dealing with flying pests.
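NDVI, mentioned above, is a simple per-pixel ratio of near-infrared and red reflectance; a minimal numpy sketch with synthetic reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in NIR, giving values near +1;
    bare soil gives values near 0."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Synthetic 2x2 scene: top row vegetated, bottom row bare soil.
nir = np.array([[0.8, 0.7], [0.30, 0.30]])
red = np.array([[0.1, 0.2], [0.25, 0.28]])
print(ndvi(nir, red).round(2))  # [[0.78 0.56] [0.09 0.03]]
```

The same per-pixel arithmetic applies whether the bands come from a UAV camera or a satellite product.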
Despite their benefits, operating drones over fields can be challenging because it requires obtaining various permissions and the expertise of specialists to ensure proper functioning. Nevertheless, they remain one of the best options for farmers to monitor their enterprise and predict crop yields. UAVs are becoming as commonplace as tractors on farms. A farmer can purchase affordable devices suitable for basic analysis, learn how to operate them, and access free specialized software to create field maps and evaluate their condition. For more complex and professional analyses, farmers can utilize commercial services.
Geoscan [88] is a group of companies engaged in the development and production of unmanned aerial vehicles and related devices, along with software for photogrammetric data processing. Figure 3 presents the Geoscan Lite drone and a multispectral camera. AssistAgro [89], on the other hand, is a digital platform designed for effective agribusiness management, with a focus on tasks such as forecasting weed emergence and development, calculating crop plant density and placement quality, predicting agricultural plant diseases, etc. The AssistAgro software interface is presented in Figure 4.
On the international market, DJI [90], Wingtra [91], Parrot [92], MicaSense [93], and Delair [94] are well-known producers of unmanned aerial vehicles and related devices. Companies like Taranis [95], Aerobotics [96], SeeTree [97], FarmAir [98], and UAV-IQ [99] can be mentioned as services that provide different analysis options for farmers based on drone imagery.

2.3. Autonomous Platforms and Tractors

In agriculture, autonomous platforms (unmanned ground vehicles, robots) assist with various warehouse operations, as well as monitor fields and execute special agronomic tasks. The payload capacity of surface-based platforms is significantly larger than that of UAVs. However, drones have distinct advantages in monitoring and control operations due to their mobility. On the other hand, autonomous platforms find their ideal application in executing agrotechnical operations, such as planting [52], harvesting [4], fertilization [53], pruning [54], weeding [55], watering [100], hilling, etc. While fully automated devices are relatively rare, partly automated ones are more prevalent. These platforms come in different sizes, complexities, and applications; nevertheless, they all employ computer vision and neural algorithms to move, monitor the situation, and perform tasks with high speed and precision [101,102,103]. For instance, platforms can deploy different tools, such as pesticides, fire, electricity, brute force, or lasers, to remove weeds and address issues much faster and more effectively than humans [104,105,106,107].
Regarding fully automated platforms, companies like Naïo Technologies [108], AgXeed [109], John Deere [110], Agrointelli [111], FarmDroid [112], and Aigro [113] provide advanced solutions. As for the Russian market, there are no significant projects in the area that can be mentioned. The Siberian tiger project [114] showed promising results, but no production solution was deployed, leading to its discontinuation. Instead, the project team shifted focus toward the commercial development of robotized platforms.
However, notable achievements have been made in auto-pilot systems, enabling tractors to execute intelligent functions and replace human drivers with operators. Cognitive Technologies [115] has installed artificial intelligence-based autonomous control systems on more than 100 tractors in Russian agricultural enterprises from Pskov to Blagoveshchensk since the spring of 2023. The Cognitive Technologies hardware installed in the cabin of the tractor is presented in Figure 5.
The cost of such a system is estimated to be around 5–10% of the total machine cost. Implementing this system allows for enhanced labor productivity (up to 25%), fuel savings (7%), and the conservation of other resources. Companies like Autonomous Solutions (Petersboro, UT, USA) [116], AgJunction Inc. (Scottsdale, AZ, USA) [117], CNH Industrial N.V. (Basildon, UK) [118], Mahindra & Mahindra Limited (Mumbai, India) [119], and Kubota (Osaka, Japan) [120] are top players in the international market.
When it comes to harvesting robots, fruits, vegetables, and berries are the primary target [4,107]. The task involves detecting objects, assessing their state, recording their coordinates, and passing this information to the harvesting mechanism. Both UAVs and autonomous platforms can be utilized for harvesting yields. However, only a few research projects related to fruit harvesting are known on the national Russian market, and they are still far from the production stage. On the international market, there are companies like Tevel [121], MetoMotion [122], AgroBot [123], and Dogtooth Technologies [124] that provide solutions for harvesting various crops. In Figure 6, a variation of harvesting robots is presented.

2.4. Controlled Environment Farming

Advances in growing technologies and the evolution of LED lighting have enabled the establishment of farms in commercial centers, parking lots, restaurants, and other locations. Vertical farming has become a highly popular field, attracting significant investments from many companies. The market has now stabilized. Some companies have had to downsize their workforce and relocate to more suitable areas. The range of crops grown is currently limited to lettuce, herbs, and berries, and the prices of these products can be several times higher than those grown in traditional fields. Nonetheless, in places with low electricity charges and high demand for fresh produce, this technology remains highly marketable.
Artificial intelligence plays a significant role in controlled environment farming, specifically in tasks such as object detection, classification, and prediction [3]. Given the high production costs, farm owners must accurately anticipate their yields and organize efficient distribution. Neural (deep recurrent neural network, deep neural network) and statistical (random forest, gradient boosting, decision tree, multivariate regression, association rule learning) models are also applied to analyze the current situation and determine optimal parameters to maximize crop yield [125,126,127]. Controlled environments allow owners to adjust various factors, such as light, humidity, fertilizer levels in soil or solutions, and more. Existing guidelines outline how to cultivate different crops and respond to changes in various parameters on the farm. Consequently, a cohesive virtual agronomist software suite aids in making well-informed decisions.
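In its simplest form, the "virtual agronomist" idea reduces to searching a fitted yield-response model for the best controllable settings. A deliberately simplified sketch, in which a hypothetical quadratic response surface stands in for a trained statistical or neural model:

```python
from itertools import product

def yield_response(light, humidity):
    """Hypothetical fitted model: yield peaks at light=0.7, humidity=0.6.
    In practice this would be a model trained on farm telemetry."""
    return 10 - 25 * (light - 0.7) ** 2 - 30 * (humidity - 0.6) ** 2

# Exhaustive grid search over the controllable parameters (both normalized
# to [0, 1]), as a minimal virtual-agronomist decision rule.
grid = [round(0.1 * i, 1) for i in range(11)]
best = max(product(grid, grid), key=lambda p: yield_response(*p))
print(best)  # (0.7, 0.6)
```

Real systems replace both pieces: the response surface comes from the cited models [125,126,127], and the search respects agronomic guidelines and operating constraints.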
Controlled environment farming heavily relies on the automation of all processes. Robots are capable of planting, moving, and harvesting crops. However, fully autonomous farms are still a conceptual idea and have not yet been fully realized [63,64,128].
In the Russian market, there are several notable companies, such as iFarm [129] and Mestnye korni [130], which have established themselves as experienced developers in this field. GALAD Green Line [131] is conducting research on the influence of different light spectra on the growth potential of various crops and offers its virtual agronomist cloud solution. Photos from vertical farms are presented in Figure 7.
The top players on the international market are AeroFarms [132], Bowery Farming [133], Futurae Farms [134], AgriCool [135], and CubicFarms [136].

2.5. Sensors and Digitalization

Digitalization is one of the main trends in agronomy. Notebooks, tablets, or phones are as common tools for farmers as a shovel and watering can used to be. Different digital systems cover all production phases from planning and seed selection to the distribution of finished products. Open knowledge bases help to find the best solutions and share experiences. Cloud-based systems help to connect farmers with consumers and provide governments, insurance companies, and banks with information about enterprise activities. In such systems, a farmer can find appropriate suppliers, services, and a workforce [65,66,67,68]. The Russian Agricultural Bank [137] is the primary provider of versatile digital platforms in the Russian agricultural sector. Its flagship project “Svoe Fermerstvo” [138] offers a comprehensive suite of services and opportunities for efficient agribusiness. Agraroom [139] and Pole.rf [140] are additional instances of functional and thriving digital platforms. Platforms such as Farmers Business Network [141], Agriconomie [142], and Farmkart [143] serve as excellent illustrations of multifunctional digital platforms for farmers in the international market.
Another aspect of digitalization is its ability to obtain detailed real-time data about farms. Low-cost sensors placed on the field, in greenhouses, or vertical farms, as well as sophisticated sensors placed on board platforms or UAVs, can provide information about humidity, soil moisture, salinity, temperature, and nutrient levels, among other useful metrics [1,48,49,50,51]. All this data is transferred to local or global cloud systems. Additional sources, such as weather and price feeds and historical records, further enlarge this pool of data. Statistical and machine learning algorithms are utilized for planning, forecasting, and finding the best way to maximize yield while minimizing expense and environmental impact [144,145,146,147]. Data-driven precision agriculture allows farmers to have a digital twin of their enterprise, providing real-time control and enabling prompt action. This integration of technology and agriculture represents the future of the industry. The Digital Agro company [148], the creator of the Agrosignal platform [149], is a prominent leader in the digitalization of agriculture within the Russian market.
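The sensing-to-forecasting loop described above can be illustrated with a minimal sketch: readings stream in, and a simple moving-average model produces a short-term forecast. This is a stand-in for the richer statistical and machine learning methods cited; the class name and window size are illustrative choices, not a real API:

```python
from collections import deque

class SoilMoistureForecaster:
    """Keeps a sliding window of sensor readings and forecasts the next
    value as the window mean, a toy placeholder for the cited models."""
    def __init__(self, window=5):
        self.readings = deque(maxlen=window)

    def add(self, value):
        self.readings.append(value)

    def forecast(self):
        if not self.readings:
            raise ValueError("no readings yet")
        return sum(self.readings) / len(self.readings)

f = SoilMoistureForecaster(window=3)
for v in [31.0, 30.5, 30.0, 29.5]:  # slowly drying soil, % volumetric water
    f.add(v)
print(f.forecast())  # mean of the last 3 readings: 30.0
```

In a cloud deployment, the `add` step would be fed by the sensor ingestion pipeline and the forecast consumed by irrigation planning.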
CropX [150], Agrivi [151], Agworld [152], Farmbrite [153], and Harvest profit [154] are international companies providing useful software for smart farming.
Similar to the concept of Enterprise 4.0, Agriculture 4.0 leverages advanced technologies such as the Internet of Things (IoT), big data, artificial intelligence, unmanned aerial vehicles (UAVs), robotics, and more to enhance, connect, accelerate, and optimize activities that impact the entire production chain. Agriculture 4.0 encompasses the tools and strategies that enable the synergistic utilization of various technologies [155] to enhance economic, environmental, and social sustainability, as well as the profitability of agricultural processes. Notable comprehensive reviews on the subject of digital transformation in agriculture have been authored by Dayioglu and Turker [6] and Abbasi et al. [7].

3. MLIT Activities Related to Artificial Intelligence in Agriculture

The Meshcheryakov Laboratory of Information Technologies (MLIT) [69] has competent specialists in the field of machine learning and neural networks, as well as a strong resource base for model training and algorithm verification. For this reason, collaboration with MLIT attracts various companies and organizations, leading to the initiation of interesting projects in different areas, including agriculture. When certain research projects are subject to Non-Disclosure Agreements (NDAs), only limited, general information about them can be disclosed.
One prominent project in this domain is the disease detection platform and its mobile application called DoctorP [71]. This platform enables users to submit a photo of a plant and receive predictions of possible diseases and treatment recommendations from agronomists. The platform offers a web interface, a Telegram bot, a mobile application, and an API for external services. Since the beginning of 2023, the platform has received over 80,000 user requests. The DoctorP mobile app interface is presented in Figure 8.
To identify the best approaches for model training, the MLIT team has tested different state-of-the-art neural network architectures, auto-augmentation policies, and loss minimization functions, including contrastive, triplet, ArcFace, CosFace, and SphereFace. Currently, we use MobileNet as the architectural base for small models with fewer than 50 classes and ConvNeXt for larger models. The triplet loss function is employed for model training, and augmentation and quantization are not used. As a result, all models achieve an impressive accuracy of over 97% [72]. We are not aware of other research where disease classification is conducted on a self-collected dataset comprising more than 70 classes. However, the research by Cui et al. [156] and Gomes et al. [157] employs very interesting techniques that deserve mention.
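The triplet loss used here pulls an image embedding toward a same-class "positive" and pushes it away from a different-class "negative" by at least a margin. A numpy sketch of the loss itself (the embedding network that produces these vectors is omitted, and the example vectors are synthetic):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """max(0, ||a-p||^2 - ||a-n||^2 + margin): zero once the negative is
    at least `margin` farther from the anchor than the positive is."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, d_pos - d_neg + margin)

a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])  # same disease class: close to the anchor
n = np.array([1.0, 0.0])  # different class: far from the anchor
print(triplet_loss(a, p, n))  # 0.0 -- the margin constraint is satisfied
```

During training, the loss is minimized over many such triplets, so embeddings of the same disease cluster together, which is what makes one-shot classification of new classes possible.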
The same classification approach was utilized in a joint project with the Temiryazev Academy as part of the World-class Scientific Center “Agrotechnologies of the Future” [158]. In this collaboration, we focused on investigating how light in different spectra affected crops during various phenophases. The trained models were successful in classifying the degree of plant development and determining the weight group of each plant. For studies involving classification in plant phenotyping, Kolhar and Jagtap have provided a comprehensive review [159].
In another project, we tackled the control of lettuces on the growing line. To accomplish this, we used a two-stage algorithm: YOLO architecture for lettuce detection, followed by classification using a one-shot model. This approach significantly reduced the data markup procedure and increased the accuracy to an impressive 99%. We also applied similar methodologies to classify food on trunks with favorable outcomes.
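The two-stage detect-then-classify flow can be sketched as follows, with stub functions standing in for the YOLO detector and one-shot classifier (all names, box coordinates, and labels here are hypothetical):

```python
def detect(image):
    """Stub detector returning bounding boxes (x, y, w, h).
    In the project described above, this stage is a YOLO model."""
    return [(10, 10, 50, 50), (80, 20, 40, 40)]

def crop(image, box):
    """Cut a patch out of a row-major image (list of rows)."""
    x, y, w, h = box
    return [row[x:x + w] for row in image[y:y + h]]

def classify(patch):
    """Stub one-shot classifier returning (label, confidence).
    In the project, this stage is a Siamese-network model."""
    return ("healthy", 0.99)

def two_stage_pipeline(image):
    """Detect objects, then classify each cropped detection."""
    return [(box, *classify(crop(image, box))) for box in detect(image)]

image = [[0] * 128 for _ in range(128)]  # placeholder 128x128 image
results = two_stage_pipeline(image)
print(results[0])  # ((10, 10, 50, 50), 'healthy', 0.99)
```

The practical benefit noted in the text follows from this split: only the detector needs box annotations, while the one-shot classifier needs just a few labeled examples per class, which shrinks the data markup effort.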
Additionally, in collaboration with Doka–Gennyye Tekhnologii [160], we executed a project focused on detecting different diseases in potatoes. Two main directions were pursued. First, we developed methods for detecting diseases visible through RGB cameras, where YOLOv8 and YOLO-NAS demonstrated the most promising results for object detection and instance segmentation tasks. We incorporated these solutions into portable computer complexes mounted on sanitary tractors. The captured images from the cameras are processed by our models, and the number of detected objects, along with geographical positions, is sent to the server to generate a heat map of the field. Figure 9A presents the results of the models’ performance on images captured by a camera mounted on a tractor. Sinshaw et al. [161] have authored an excellent review on methods for potato disease detection.
The second direction involved working with hyperspectral images of potatoes to identify diseased plants before the appearance of visible symptoms. We explored various image classification and pixel-level analysis algorithms to determine the most effective approach. Polder et al. conducted noteworthy research [162] focused on detecting virus Y using hyperspectral images.
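Pixel-level analysis of a hyperspectral cube typically means flattening the height x width x bands cube into a pixel-by-band matrix and scoring each spectrum individually. A toy numpy sketch, with a single-band reflectance threshold standing in for the classifiers we evaluated (the band index and threshold are illustrative, not tuned values):

```python
import numpy as np

def flatten_cube(cube):
    """Reshape an (H, W, B) hyperspectral cube to an (H*W, B) pixel matrix."""
    h, w, b = cube.shape
    return cube.reshape(h * w, b)

def diseased_mask(cube, band, threshold):
    """Toy per-pixel rule: flag pixels whose reflectance in one band drops
    below a threshold, then reshape the flags back to the image plane."""
    pixels = flatten_cube(cube)
    return (pixels[:, band] < threshold).reshape(cube.shape[:2])

rng = np.random.default_rng(1)
cube = rng.uniform(0.5, 1.0, size=(4, 4, 10))  # healthy reflectance levels
cube[1, 2, :] = 0.1                            # one stressed pixel
mask = diseased_mask(cube, band=3, threshold=0.4)
print(mask.sum(), mask[1, 2])  # 1 True
```

The cube-to-pixel-matrix reshape is the common preprocessing step; the per-pixel rule is then replaced by a trained classifier operating on the full spectrum.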
We are also working on an automated analysis of plant states in greenhouse complexes. Our goal is to simplify routine operations for agronomists and provide them with convenient tools for monitoring plant health in these environments. A key challenge in greenhouses is the identification of pests in their early stages and their precise location on the leaves. Martin et al. [163] and Tiwari et al. [164] propose intriguing solutions; however, they come with restrictions regarding operating heights and surfaces. Our project requires the development of autonomous robotic platforms capable of moving on different surfaces and operating at heights of up to 4 m. We are currently working on various tasks, including segmentation, classification, object detection, and tracking to achieve these objectives. Several model outputs and experiments related to the project are displayed in Figure 10.
Testing of a prototype platform and system is planned to begin in one of the greenhouse complexes in the Kaluga region by the end of 2023.

4. Conclusions

To feed the growing population, agriculture companies must utilize the advanced technologies of smart farming. Some of these technologies are still in the development phase, while others are already being offered by commercial enterprises. Right now, farms can leverage a range of cutting-edge tools, such as satellite data, UAVs, autonomous platforms, sensors, and robots to acquire detailed information about the current status of their crops and soil, as well as to perform common agronomic operations. Nevertheless, the adoption of new agricultural technologies remains limited in many countries, creating significant opportunities for both established businesses and startups.
Artificial intelligence plays a significant role in many modern agricultural technologies, enabling solutions for various challenges, including crop health monitoring, yield prediction, price forecasting, yield mapping, pesticide and fertilizer usage optimization, etc. While state-of-the-art results have already been achieved in some areas, others require further effort. Tasks like detection and navigation in complex conditions, the execution of precise operations, and the forecasting and control of multi-parameter systems offer opportunities for researchers to apply their expertise.
Precision agriculture, with a focus on ensuring food security and optimizing resources like water, fertilizers, and pesticides, as well as conducting comprehensive analysis and control across the entire enterprise, represents the future of the industry. Over the next few decades, fully autonomous devices operating on farms and autonomous cultivation in controlled environments are likely to become a reality. At MLIT, we leverage our infrastructure to train classification, detection, and segmentation models that find applications in various agricultural contexts. Our emphasis extends beyond developing scientific solutions to creating readily deployable applications that benefit the agricultural community.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Xu, J.; Gu, B.; Tian, G. Review of agricultural IoT technology. Artif. Intell. Agric. 2022, 6, 10–22. [Google Scholar] [CrossRef]
  2. Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in agriculture: A review and bibliometric analysis. Comput. Electron. Agric. 2022, 198, 107017. [Google Scholar] [CrossRef]
  3. Ojo, M.O.; Zahid, A. Deep Learning in Controlled Environment Agriculture: A Review of Recent Advancements, Challenges and Prospects. Sensors 2022, 22, 7965. [Google Scholar] [CrossRef] [PubMed]
  4. Mail, M.F.; Maja, J.M.; Marshall, M.; Cutulle, M.; Miller, G.; Barnes, E. Agricultural Harvesting Robot Concept Design and System Components: A Review. AgriEngineering 2023, 5, 777–800. [Google Scholar] [CrossRef]
  5. Wang, T.; Chen, B.; Zhang, Z.; Li, H.; Zhang, M. Applications of machine vision in agricultural robot navigation: A review. Comput. Electron. Agric. 2022, 198, 107085. [Google Scholar] [CrossRef]
  6. Dayioglu, M.A.; Turker, U. Digital transformation for sustainable future agriculture 4.0: A review. J. Agric. Sci. 2021, 27, 373–399. [Google Scholar]
  7. Abbasi, R.; Martinez, P.; Ahmad, R. The digitization of agricultural industry—A systematic literature review on agriculture 4.0. Smart Agric. Technol. 2022, 2, 100042. [Google Scholar] [CrossRef]
  8. Oliveira, R.C.d.; Silva, R.D.d.S.e. Artificial Intelligence in Agriculture: Benefits, Challenges, and Trends. Appl. Sci. 2023, 13, 7405. [Google Scholar] [CrossRef]
  9. Li, X.; Sun, C.; Meng, H.; Ma, X.; Huang, G.; Xu, X. A Novel Efficient Method for Land Cover Classification in Fragmented Agricultural Landscapes Using Sentinel Satellite Imagery. Remote Sens. 2022, 14, 2045. [Google Scholar] [CrossRef]
  10. Forkuor, G.; Hounkpatin, O.K.; Welp, G.; Thiel, M. High Resolution Mapping of Soil Properties Using Remote Sensing Variables in South-Western Burkina Faso: A Comparison of Machine Learning and Multiple Linear Regression Models. PLoS ONE 2017, 12, e0170478. [Google Scholar] [CrossRef]
  11. Dindaroğlu, T.; Kılıç, M.; Günal, E.; Gündoğan, R.; Akay, A.E.; Seleiman, M. Multispectral UAV and Satellite Images for Digital Soil Modeling with Gradient Descent Boosting and Artificial Neural Network. Earth Sci. Inform. 2022, 15, 2239–2263. [Google Scholar] [CrossRef]
  12. Berhane, T.M.; Lane, C.R.; Wu, Q.; Autrey, B.C.; Anenkhonov, O.A.; Chepinoga, V.V.; Liu, H. Decision-Tree, Rule-Based, and Random Forest Classification of High-Resolution Multispectral Imagery for Wetland Mapping and Inventory. Remote Sens. 2018, 10, 580. [Google Scholar] [CrossRef] [PubMed]
  13. Silvero, N.E.; Dematte, J.A.; Vieira, J.D.S.; Mello, F.A.D.O.; Amorim, M.T.A.; Poppiel, R.R.; Mendes, W.D.S.; Bonfatti, B.R. Soil property maps with satellite images at multiple scales and its impact on management and classification. Geoderma 2021, 397, 115089. [Google Scholar] [CrossRef]
  14. Ghaderi, A.; Abbaszadeh Shahri, A.; Larsson, S. An artificial neural network based model to predict spatial soil type distribution using piezocone penetration test data (CPTu). Bull. Eng. Geol. Environ. 2019, 78, 4579–4588. [Google Scholar] [CrossRef]
  15. Bermudez, J.D.; Achanccaray, P.; Sanches, I.D.; Cue, L.; Happ, P. Evaluation of Recurrent Neural Networks for Crop Recognition from Multitemporal Remote Sensing Images. In Proceedings of the Anais do XXVII Congresso Brasileiro de Cartografia, Rio de Janeiro, Brazil, 6–9 November 2017; pp. 800–804. [Google Scholar]
  16. Ndikumana, E.; Ho Tong Minh, D.; Baghdadi, N.; Courault, D.; Hossard, L. Deep Recurrent Neural Network for Agricultural Classification using multitemporal SAR Sentinel-1 for Camargue, France. Remote Sens. 2018, 10, 1217. [Google Scholar] [CrossRef]
  17. Mohan, S.; Kapil Dev, T. Pixel based classification for Landsat 8 OLI multispectral satellite images using deep learning neural network. Remote Sens. Appl. Soc. Environ. 2021, 24, 100645. [Google Scholar] [CrossRef]
  18. Pandey, A.; Kumar, D.; Chakraborty, D.B. Soil Type Classification from High Resolution Satellite Images with Deep CNN. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 4087–4090. [Google Scholar] [CrossRef]
  19. Naushad, R.; Kaur, T.; Ghaderpour, E. Deep Transfer Learning for Land Use and Land Cover Classification: A Comparative Study. Sensors 2021, 21, 8083. [Google Scholar] [CrossRef]
  20. Rabiei, S.; Jalilvand, E.; Tajrishy, M. A Method to Estimate Surface Soil Moisture and Map the Irrigated Cropland Area Using Sentinel-1 and Sentinel-2 Data. Sustainability 2021, 13, 11355. [Google Scholar] [CrossRef]
  21. Paul, S.; Kumari, M.; Murthy, C.S.; Kumar, D.N. Generating pre-harvest crop maps by applying convolutional neural network on multi-temporal Sentinel-1 data. Int. J. Remote Sens. 2022, 43, 6078–6101. [Google Scholar] [CrossRef]
  22. Peng, Y.; Liu, Z.; Lin, C.; Hu, Y.; Zhao, L.; Zou, R.; Wen, Y.; Mao, X. A new method for estimating soil fertility using extreme gradient boosting and a backpropagation neural network. Remote Sens. 2022, 14, 3311. [Google Scholar] [CrossRef]
  23. Virnodkar, S.S.; Pachghare, V.K.; Patil, V.C.; Jha, S.K. DenseResUNet: An Architecture to Assess Water-Stressed Sugarcane Crops from Sentinel-2 Satellite Imagery. Trait. Du Signal 2021, 38, 1131–1139. [Google Scholar] [CrossRef]
  24. Pignatti, S.; Casa, R.; Laneve, G.; Li, Z.; Liu, L.; Marzialetti, P.; Mzid, N.; Pascucci, S.; Silvestro, P.C.; Tolomio, M.; et al. Sino-EU Earth Observation Data to Support the Monitoring and Management of Agricultural Resources. Remote Sens. 2021, 13, 2889. [Google Scholar] [CrossRef]
  25. Goswami, B.; Nayak, P. Optimization in agricultural growth using AI and satellite imagery. In Data Science in Societal Applications; Springer Nature: Singapore, 2022; pp. 107–125. [Google Scholar]
  26. Ji, Z.; Pan, Y.; Zhu, X.; Zhang, D.; Dai, J. Prediction of corn yield in the USA corn belt using satellite data and machine learning: From an Evapotranspiration perspective. Agriculture 2022, 12, 1263. [Google Scholar] [CrossRef]
  27. Luo, Y.; Zhang, Z.; Cao, J.; Zhang, L.; Zhang, J.; Han, J.; Zhuang, H.; Cheng, F.; Tao, F. Accurately mapping global wheat production system using deep learning algorithms. Int. J. Appl. Earth Obs. Geoinf. 2022, 110, 102823. [Google Scholar] [CrossRef]
  28. Xie, Y. Combining CERES-Wheat model, Sentinel-2 data, and deep learning method for winter wheat yield estimation. Int. J. Remote Sens. 2022, 43, 630–648. [Google Scholar] [CrossRef]
  29. Watson-Hernandez, F.; Gomez-Calderon, N.; da Silva, R.P. Oil palm yield estimation based on vegetation and humidity indices generated from satellite images and machine learning techniques. AgriEngineering 2022, 4, 279–291. [Google Scholar] [CrossRef]
  30. Huuskonen, J.; Oksanen, T. Soil sampling with drones and augmented reality in precision agriculture. Comput. Electron. Agric. 2018, 154, 25–35. [Google Scholar] [CrossRef]
  31. Zhou, J.; Xu, Y.; Gu, X.; Chen, T.; Sun, Q.; Zhang, S.; Pan, Y. High-Precision Mapping of Soil Organic Matter Based on UAV Imagery Using Machine Learning Algorithms. Drones 2023, 7, 290. [Google Scholar] [CrossRef]
  32. Bertalan, L.; Holb, I.; Pataki, A.; Negyesi, G.; Szabo, G.; Kupasne Szaloki, A.; Szabo, S. UAV-based multispectral and thermal cameras to predict soil water content—A machine learning approach. Comput. Electron. Agric. 2022, 200, 107262. [Google Scholar] [CrossRef]
  33. Wilke, N.; Siegmann, B.; Postma, J.; Muller, O.; Krieger, V.; Pude, R.; Rascher, U. Assessment of plant density for barley and wheat using UAV multispectral imagery for high-throughput field phenotyping. Comput. Electron. Agric. 2021, 189, 106380. [Google Scholar] [CrossRef]
  34. Koh, J.C.O.; Hayden, M.; Daetwyler, H. Estimation of crop plant density at early mixed growth stages using UAV imagery. Plant Methods 2019, 15, 64. [Google Scholar] [CrossRef] [PubMed]
  35. Ong, P.; Teo, K.S.; Sia, C.K. UAV-based weed detection in Chinese cabbage using deep learning. Smart Agric. Technol. 2023, 4, 100181. [Google Scholar] [CrossRef]
  36. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Iqbal, J.; Alam, M. A novel semi-supervised framework for UAV based crop/weed classification. PLoS ONE 2021, 16, e0251008. [Google Scholar] [CrossRef] [PubMed]
  37. Tetila, E.C.; Machado, B.B.; Astolfi, G.; de Souza Belete, N.A.; Amorim, W.P.; Roel, A.R.; Pistori, H. Detection and classification of soybean pests using deep learning with UAV images. Comput. Electron. Agric. 2020, 179, 105836. [Google Scholar] [CrossRef]
  38. Mohidem, N.A.; Che Ya, N.N.; Juraimi, A.S.; Fazlil Ilahi, W.F.; Mohd Roslim, M.H.; Sulaiman, N.; Saberioon, M.; Mohd Noor, N. How Can Unmanned Aerial Vehicles Be Used for Detecting Weeds in Agricultural Fields? Agriculture 2021, 11, 1004. [Google Scholar] [CrossRef]
  39. Kumar, C.; Mubvumba, P.; Huang, Y.; Dhillon, J.; Reddy, K. Multi-Stage Corn Yield Prediction Using High-Resolution UAV Multispectral Data and Machine Learning Models. Agronomy 2023, 13, 1277. [Google Scholar] [CrossRef]
  40. Zeng, L.; Peng, G.; Meng, R.; Man, J.; Li, W.; Xu, B.; Lu, Z.; Sun, R. Wheat Yield Prediction Based on Unmanned Aerial Vehicles-Collected Red–Green–Blue Imagery. Remote Sens. 2021, 13, 2937. [Google Scholar] [CrossRef]
  41. Shahi, B.; Xu, C.Y.; Neupane, A.; Fleischfresser, D.; O’Connor, D.; Wright, G.; Guo, W. Peanut yield prediction with UAV multispectral imagery using a cooperative machine learning approach. Electron. Res. Arch. 2023, 31, 3343–3361. [Google Scholar] [CrossRef]
  42. Chen, P.; Ouyang, F.; Zhang, Y.; Lan, Y. Preliminary Evaluation of Spraying Quality of Multi-Unmanned Aerial Vehicle (UAV) Close Formation Spraying. Agriculture 2022, 12, 1149. [Google Scholar] [CrossRef]
  43. Song, C.; Liu, L.; Wang, G.; Han, J.; Zhang, T.; Lan, Y. Particle Deposition Distribution of Multi-Rotor UAV-Based Fertilizer Spreader under Different Height and Speed Parameters. Drones 2023, 7, 425. [Google Scholar] [CrossRef]
  44. Su, D.; Yao, W.; Yu, F.; Liu, Y.; Zheng, Z.; Wang, Y.; Xu, T.; Chen, C. Single-Neuron PID UAV Variable Fertilizer Application Control System Based on a Weighted Coefficient Learning Correction. Agriculture 2022, 12, 1019. [Google Scholar] [CrossRef]
  45. Anand, K.; Goutam, R. An autonomous UAV for pesticide spraying. Int. J. Trend Sci. Res. Dev. 2019, 3, 986–990. [Google Scholar]
  46. Ivić, S.; Andrejčuk, A.; Družeta, S. Autonomous control for multi-agent non-uniform spraying. Appl. Soft Comput. 2019, 80, 742–760. [Google Scholar] [CrossRef]
  47. Sinha, J.P. Aerial robot for smart farming and enhancing farmers’ net benefit. Indian J. Agric. Sci. 2020, 90, 258–267. [Google Scholar] [CrossRef]
  48. Dhanaraju, M.; Chenniappan, P.; Ramalingam, K.; Pazhanivelan, S.; Kaliaperumal, R. Smart Farming: Internet of Things (IoT)-Based Sustainable Agriculture. Agriculture 2022, 12, 1745. [Google Scholar] [CrossRef]
  49. Gagliardi, G.; Lupia, M.; Cario, G.; Cicchello Gaccio, F.; D’Angelo, V.; Cosma, A.I.M.; Casavola, A. An Internet of Things Solution for Smart Agriculture. Agronomy 2021, 11, 2140. [Google Scholar] [CrossRef]
  50. Madushanki, R.; Halgamuge, M.; Wirasagoda, S.; Syed, A. Adoption of the Internet of Things (IoT) in Agriculture and Smart Farming towards Urban Greening: A Review. Int. J. Adv. Comput. Sci. Appl. 2019, 10, 11–28. [Google Scholar] [CrossRef]
  51. Bilotta, G.; Genovese, E.; Citroni, R.; Cotroneo, F.; Meduri, G.M.; Barrile, V. Integration of an Innovative Atmospheric Forecasting Simulator and Remote Sensing Data into a Geographical Information System in the Frame of Agriculture 4.0 Concept. AgriEngineering 2023, 5, 1280–1301. [Google Scholar] [CrossRef]
  52. Azmi, H.N.; Hajjaj, S.S.H.; Gsangaya, K.R.; Sultan, M.T.H.; Mail, M.F.; Hua, L.S. Design and fabrication of an agricultural robot for crop seeding. Mater. Today Proc. 2023, 81, 283–289. [Google Scholar] [CrossRef]
  53. Ghafar, A.S.A.; Hajjaj, S.S.H.; Gsangaya, K.R.; Sultan, M.T.H.; Mail, M.F.; Hua, L.S. Design and development of a robot for spraying fertilizers and pesticides for agriculture. Mater. Today Proc. 2023, 81, 242–248. [Google Scholar] [CrossRef]
  54. Otani, T.; Itoh, A.; Mizukami, H.; Murakami, M.; Yoshida, S.; Terae, K.; Tanaka, T.; Masaya, K.; Aotake, S.; Funabashi, M. Agricultural Robot under Solar Panels for Sowing, Pruning, and Harvesting in a Synecoculture Environment. Agriculture 2023, 13, 18. [Google Scholar] [CrossRef]
  55. Raffik, R.; Mayukha, S.; Hemchander, J.; Abishek, D.; Tharun, R.; Kumar, S.D. Autonomous Weeding Robot for Organic Farming Fields. In Proceedings of the International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation, Coimbatore, India, 8–9 October 2021; pp. 1–4. [Google Scholar] [CrossRef]
  56. Pandiaraj, K.; Prakash, K.J.; Dhanalakshmi, K.S.; Teja, M.S.; Kalyan, K.P.; Basha, S.M. Autonomous Robot for Field Health Indication and Crop Monitoring System using Artificial Intelligence. In Proceedings of the 2nd International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE), Greater Noida, India, 28–29 April 2022; pp. 937–942. [Google Scholar] [CrossRef]
  57. Chandana, R.; Nisha, M.; Pavithra, B.; Sumana, S.; Nagashree, R.N. A Multipurpose Agricultural Robot for Automatic Ploughing, Seeding and Plant Health Monitoring. Int. J. Eng. Res. Technol. (IJERT) IETE 2020, 8, 57–60. [Google Scholar]
  58. Cubero, S.; Marco-noales, E.; Aleixos, N.; Barbé, S.; Blasco, J. Robhortic: A field robot to detect pests and diseases in horticultural crops by proximal sensing. Agriculture 2020, 10, 276. [Google Scholar] [CrossRef]
  59. Seo, D.; Cho, B.-H.; Kim, K.-C. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. [Google Scholar] [CrossRef]
  60. Wu, C.; Tang, X.; Xu, X. System Design, Analysis, and Control of an Intelligent Vehicle for Transportation in Greenhouse. Agriculture 2023, 13, 1020. [Google Scholar] [CrossRef]
  61. Rosero-Montalvo, P.D.; Gordillo-Gordillo, C.A.; Hernandez, W. Smart Farming Robot for Detecting Environmental Conditions in a Greenhouse. IEEE Access 2023, 11, 57843–57853. [Google Scholar] [CrossRef]
  62. Saddik, A.; Latif, R.; Taher, F.; El Ouardi, A.; Elhoseny, M. Mapping Agricultural Soil in Greenhouse Using an Autonomous Low-Cost Robot and Precise Monitoring. Sustainability 2022, 14, 15539. [Google Scholar] [CrossRef]
  63. Karanisa, T.; Achour, Y.; Ouammi, A. Smart greenhouses as the path towards precision agriculture in the food-energy and water nexus: Case study of Qatar. Environ. Syst. Decis. 2022, 42, 521–546. [Google Scholar] [CrossRef]
  64. Gan, C.I.; Soukoutou, R.; Conroy, D.M. Sustainability Framing of Controlled Environment Agriculture and Consumer Perceptions: A Review. Sustainability 2023, 15, 304. [Google Scholar] [CrossRef]
  65. Kenney, M.; Serhan, H.; Trystram, G. Digitization and Platforms in Agriculture: Organizations, Power Asymmetry, and Collective Action Solutions; SSRN Scholarly Paper ID 3638547; Social Science Research Network: Rochester, NY, USA, 2020; 50p. [Google Scholar] [CrossRef]
  66. Kolmykova, T.; Kazarenkova, N.; Merzlyakova, E.; Aseev, O.; Kovalev, P. Digital platforms in the new world of digital agricultural business. IOP Conf. Ser. Earth Environ. Sci. 2021, 941, 012008. [Google Scholar] [CrossRef]
  67. Borrero, J.D.; Mariscal, J. A Case Study of a Digital Data Platform for the Agricultural Sector: A Valuable Decision Support System for Small Farmers. Agriculture 2022, 12, 767. [Google Scholar] [CrossRef]
  68. Singh, N.; Kapoor, S. Configuring the agricultural platforms: Farmers’ preferences for design attributes. J. Agribus. Dev. Emerg. Econ. 2023. [Google Scholar] [CrossRef]
  69. The Meshcheryakov Laboratory of Information Technologies Site. Available online: https://lit.jinr.ru/ (accessed on 10 October 2023).
  70. The Joint Institute for Nuclear Research Site. Available online: http://www.jinr.ru/main-en/ (accessed on 10 October 2023).
  71. The DoctorP Project Site. Available online: https://doctorp.org/ (accessed on 10 October 2023).
  72. Uzhinskiy, A.; Ososkov, G.; Goncharov, P.; Nechaevskiy, A.; Smetanin, A. Oneshot learning with triplet loss for vegetation classification tasks. Comput. Opt. 2021, 45, 608–614. [Google Scholar] [CrossRef]
  73. The National Aeronautics and Space Administration Site. Available online: https://www.nasa.gov/ (accessed on 10 October 2023).
  74. The Landsat Mission Site. Available online: https://www.usgs.gov/landsat-missions (accessed on 10 October 2023).
  75. The Sentinel Mission Site. Available online: https://sentinels.copernicus.eu/ (accessed on 10 October 2023).
  76. The Modis Site. Available online: https://modis.gsfc.nasa.gov/ (accessed on 10 October 2023).
  77. The Planet Project Site. Available online: https://www.planet.com/ (accessed on 10 October 2023).
  78. The Google Earth Engine Project Site. Available online: https://earthengine.google.com/ (accessed on 10 October 2023).
  79. Mendes, J.; Pinho, T.M.; Neves dos Santos, F.; Sousa, J.J.; Peres, E.; Boaventura-Cunha, J.; Cunha, M.; Morais, R. Smartphone Applications Targeting Precision Agriculture Practices—A Systematic Review. Agronomy 2020, 10, 855. [Google Scholar] [CrossRef]
  80. The Onesoil Project Site. Available online: https://onesoil.ai/ (accessed on 10 October 2023).
  81. Pasture.io Site. Available online: https://pasture.io/ (accessed on 10 October 2023).
  82. GeoPard Agriculture Project Site. Available online: https://geopard.tech/ (accessed on 10 October 2023).
  83. Maxar Technologies Site. Available online: https://www.maxar.com/ (accessed on 10 October 2023).
  84. The Airbus Satellite Project Site. Available online: https://www.intelligence-airbusds.com/imagery/ (accessed on 10 October 2023).
  85. Aerospace-Agro Projects Site. Available online: https://www.aerospace-agro.com/ (accessed on 10 October 2023).
  86. The AgroScout Company Site. Available online: https://agro-scout.com/ (accessed on 10 October 2023).
  87. Radočaj, D.; Šiljeg, A.; Marinović, R.; Jurišić, M. State of Major Vegetation Indices in Precision Agriculture Studies Indexed in Web of Science: A Review. Agriculture 2023, 13, 707. [Google Scholar] [CrossRef]
  88. The Geoscan Group Company Site. Available online: https://www.geoscan.ru/ (accessed on 10 October 2023).
  89. The Assist Agro Site. Available online: https://agroassist.ru/ (accessed on 10 October 2023).
  90. The DJI Company Site. Available online: https://ag.dji.com/ (accessed on 10 October 2023).
  91. The Wingtra Company Site. Available online: https://wingtra.com/ (accessed on 10 October 2023).
  92. The Parrot Company Site. Available online: https://www.parrot.com/ (accessed on 10 October 2023).
  93. The MicaSense Company Site. Available online: https://support.micasense.com/ (accessed on 10 October 2023).
  94. The Delair Company Site. Available online: https://delair.aero/ (accessed on 10 October 2023).
  95. The Taranis Company Site. Available online: https://www.taranis.com/ (accessed on 10 October 2023).
  96. The Aerobotics Company Site. Available online: https://www.aerobotics.com/ (accessed on 10 October 2023).
  97. The See Tree Company Site. Available online: https://www.seetree.ai/ (accessed on 10 October 2023).
  98. The FarmAir Company Site. Available online: https://farmair.io/ (accessed on 10 October 2023).
  99. The UAV-IQ Site. Available online: https://www.uaviq.com/ (accessed on 10 October 2023).
  100. Hassan, A.; Asif, R.; Rehman, A.; Nishtar, Z.; Kaabar, M.; Afsar, K. Design and development of an Irrigation Mobile Robot. IAES Int. J. Robot. Autom. 2021, 10, 75–90. [Google Scholar] [CrossRef]
  101. Li, Y.; Li, J.; Zhou, W.; Yao, Q.; Nie, J.; Qi, X. Robot Path Planning Navigation for Dense Planting Red Jujube Orchards Based on the Joint Improved A* and DWA Algorithms under Laser SLAM. Agriculture 2022, 12, 1445. [Google Scholar] [CrossRef]
  102. Bai, Y.; Zhang, B.; Xu, N.; Zhou, J.; Shi, J.; Diao, Z. Vision-based navigation and guidance for agricultural autonomous vehicles and robots: A review. Comput. Electron. Agric. 2023, 205, 107584. [Google Scholar] [CrossRef]
  103. Wang, H.; Gu, J.; Wang, M. A review on the application of computer vision and machine learning in the tea industry. Front. Sustain. Food Syst. 2023, 7, 1172543. [Google Scholar] [CrossRef]
  104. Wu, X.; Aravecchia, S.; Lottes, P.; Stachniss, C.; Pradalier, C. Robotic weed control using automated weed and crop classification. J. Field Robot. 2020, 37, 322–340. [Google Scholar] [CrossRef]
  105. Li, Y.; Guo, Z.; Shuang, F.; Zhang, M.; Li, X. Key technologies of machine vision for weeding robots: A review and benchmark. Comput. Electron. Agric. 2022, 196, 106880. [Google Scholar] [CrossRef]
  106. Adeniji, A.; Jack, K.; Idris, M.; Oyewobi, S.; Musa, H.; Oyelami, A. Deployment of an Artificial Intelligent Robot for Weed Management in Legumes Farmland. ABUAD J. Eng. Res. Dev. 2023, 6, 28–38. [Google Scholar] [CrossRef]
  107. Droukas, L.; Doulgeri, Z.; Tsakiridis, N.L. A Survey of Robotic Harvesting Systems and Enabling Technologies. J. Intell. Robot. Syst. 2023, 107, 21. [Google Scholar] [CrossRef] [PubMed]
  108. The Naïo Technologies Site. Available online: https://www.naio-technologies.com/ (accessed on 10 October 2023).
  109. The AgXeed Company Site. Available online: https://www.agxeed.com/ (accessed on 10 October 2023).
  110. The John Deere Company Site. Available online: https://www.deere.com/ (accessed on 10 October 2023).
  111. The Agrointelli Company Site. Available online: https://agrointelli.com/ (accessed on 10 October 2023).
  112. The FarmDroid Company Site. Available online: https://farmdroid.dk/ (accessed on 10 October 2023).
  113. The Aigro Project Site. Available online: https://www.aigro.nl/ (accessed on 10 October 2023).
  114. The Latest News Related to the Siberian Tiger Project. Available online: http://agrofarm.vdnh.ru/news/379-kompaniya-agrirobot-predstavila-sel-skokhozyajstvennogo-robota-siberian-tiger-na-vdnkh (accessed on 10 October 2023).
  115. The Cognitive Technologies Company Site. Available online: https://cognitive.ru/ (accessed on 10 October 2023).
  116. The Autonomous Solutions, Inc. Company Site. Available online: https://asirobots.com/ (accessed on 10 October 2023).
  117. The AgJunction Inc. Company Site. Available online: https://www.agjunction.com/ (accessed on 10 October 2023).
  118. The CNH Industrial N.V. Company Site. Available online: https://www.cnhindustrial.com/ (accessed on 10 October 2023).
  119. The Mahindra & Mahindra Limited Company Site. Available online: https://www.mahindra.com/ (accessed on 10 October 2023).
  120. The Kubota Corporation Site. Available online: https://www.kubota.com/ (accessed on 10 October 2023).
  121. The Tevel Company Site. Available online: https://www.tevel-tech.com/ (accessed on 10 October 2023).
  122. The MetoMotion Company Site. Available online: https://metomotion.com/ (accessed on 10 October 2023).
  123. The AgroBot Company Site. Available online: https://www.agrobot.com/ (accessed on 10 October 2023).
  124. The Dogtooth Technologies Company Site. Available online: https://dogtooth.tech/ (accessed on 10 October 2023).
  125. Farhangi, H.; Mozafari, V.; Roosta, H.R. Optimizing growth conditions in vertical farming: Enhancing lettuce and basil cultivation through the application of the Taguchi method. Sci. Rep. 2023, 13, 6717. [Google Scholar] [CrossRef] [PubMed]
  126. Shasteen, K.; Kacira, M. Predictive Modeling and Computer Vision-Based Decision Support to Optimize Resource Use in Vertical Farms. Sustainability 2023, 15, 7812. [Google Scholar] [CrossRef]
  127. Avgoustaki, D.D.; Xydis, G. Energy cost reduction by shifting electricity demand in indoor vertical farms with artificial lighting. Biosyst. Eng. 2021, 211, 219–229. [Google Scholar] [CrossRef]
  128. Ruli, R.; Seminar, K.; Wahjuni, S.; Santosa, E. Vertical Farming Perspectives in Support of Precision Agriculture Using Artificial Intelligence: A Review. Computers 2022, 11, 135. [Google Scholar] [CrossRef]
  129. The iFarm Company Site. Available online: https://ifarm.fi/ (accessed on 10 October 2023).
  130. The Mestnye Korni Company Site. Available online: https://localroots.ru/ (accessed on 10 October 2023).
  131. The GALAD Green Line Site. Available online: https://npcsvet.ru/ (accessed on 10 October 2023).
  132. The AeroFarms Company Site. Available online: https://www.aerofarms.com/ (accessed on 10 October 2023).
  133. The Bowery Farming Company Site. Available online: https://bowery.co/ (accessed on 10 October 2023).
  134. The Futurae Farms Company Site. Available online: https://www.futuraefarms.com/ (accessed on 10 October 2023).
  135. The Agritecture Company Site. Available online: https://www.agritecture.com/agricool (accessed on 10 October 2023).
  136. The CubicFarms Company Site. Available online: https://cubicfarms.com/ (accessed on 10 October 2023).
  137. The Russian Agricultural Bank Site. Available online: https://old.rshb.ru/ (accessed on 10 October 2023).
  138. Svoe Fermerstvo Projects Site. Available online: https://svoefermerstvo.ru/ (accessed on 10 October 2023).
  139. The AgraRoom Projects Site. Available online: https://agraroom.ru/ (accessed on 10 October 2023).
  140. The Pole.rf Projects Site. Available online: https://xn--e1alid.xn--p1ai/ (accessed on 10 October 2023).
  141. The Farmers Business Network Platform Site. Available online: https://www.fbn.com/ (accessed on 10 October 2023).
  142. The Agriconomy Platform Site. Available online: https://www.agriconomie.com/ (accessed on 10 October 2023).
  143. The Farmcart Platform Site. Available online: https://farmkartgroup.com/ (accessed on 10 October 2023).
  144. Kethineni, K.; Pradeepini, G. An Overview of Smart Agriculture Activities using Machine Learning and IoT; AIP Publishing: Melville, NY, USA, 2023. [Google Scholar] [CrossRef]
  145. Acharya, B.; Garikapati, K.; Yarlagadda, A.; Dash, S. Internet of Things (IoT) and Data Analytics in Smart Agriculture: Benefits and Challenges; Academic Press: San Diego, CA, USA, 2022. [Google Scholar] [CrossRef]
  146. Luyckx, M.; Reins, L. The Future of Farming: The (Non)-Sense of Big Data Predictive Tools for Sustainable EU Agriculture. Sustainability 2022, 14, 12968. [Google Scholar] [CrossRef]
  147. Cravero, A.; Sepúlveda, S. Use and Adaptations of Machine Learning in Big Data—Applications in Real Cases in Agriculture. Electronics 2021, 10, 552. [Google Scholar] [CrossRef]
  148. The Digital Agro Company Site. Available online: https://digitalagro.ru/ (accessed on 10 October 2023).
  149. The Agrosignal Digital Platform Site. Available online: https://agrosignal.com/ (accessed on 10 October 2023).
  150. The CropX Site. Available online: https://cropx.com/ (accessed on 10 October 2023).
  151. The Agrivi Site. Available online: https://www.agrivi.com/ (accessed on 10 October 2023).
  152. The Agworld Site. Available online: https://www.agworld.com/eu/ (accessed on 10 October 2023).
  153. The Farmbrite Site. Available online: https://www.farmbrite.com/ (accessed on 10 October 2023).
  154. The Harvest Profit Site. Available online: https://www.harvestprofit.com/ (accessed on 10 October 2023).
  155. Barrile, V.; Simonetti, S.; Citroni, R.; Fotia, A.; Bilotta, G. Experimenting Agriculture 4.0 with Sensors: A Data Fusion Approach between Remote Sensing, UAVs and Self-Driving Tractors. Sensors 2022, 22, 7910. [Google Scholar] [CrossRef]
  156. Cui, Z.; Li, K.; Kang, C.; Wu, Y.; Li, T.; Li, M. Plant and Disease Recognition Based on PMF Pipeline Domain Adaptation Method: Using Bark Images as Meta-Dataset. Plants 2023, 12, 3280. [Google Scholar] [CrossRef] [PubMed]
  157. Gomes, J.C.; Borges, L.A.B.; Borges, D.L. A Multi-Layer Feature Fusion Method for Few-Shot Image Classification. Sensors 2023, 23, 6880. [Google Scholar] [CrossRef] [PubMed]
  158. The World-Class Scientific Center “Agrotechnologies of the Future” Site. Available online: https://future-agro.ru/ (accessed on 10 October 2023).
  159. Kolhar, S.; Jagtap, J. Plant trait estimation and classification studies in plant phenotyping using machine vision—A review. Inf. Process. Agric. 2023, 10, 114–135. [Google Scholar] [CrossRef]
  160. The Doka-Gennyye Tekhnologii Company Site. Available online: https://dokagene.ru/ (accessed on 10 October 2023).
  161. Sinshaw, N.T.; Assefa, B.G.; Mohapatra, S.K.; Beyene, A.M. Applications of Computer Vision on Automatic Potato Plant Disease Detection: A Systematic Literature Review. Comput. Intell. Neurosci. 2022, 2022, 7186687. [Google Scholar] [CrossRef] [PubMed]
  162. Polder, G.; Blok, P.M.; de Villiers, H.A.C.; van der Wolf, J.M.; Kamp, J. Potato Virus Y Detection in Seed Potatoes Using Deep Learning on Hyperspectral Images. Front. Plant Sci. 2019, 10, 209. [Google Scholar] [CrossRef]
  163. Martin, J.; Ansuategi, A.; Maurtua, I.; Gutierrez, A.; Obregón, D.; Casquero, O.; Marcos, M. A Generic ROS-Based Control Architecture for Pest Inspection and Treatment in Greenhouses Using a Mobile Manipulator. IEEE Access 2021, 9, 94981–94995. [Google Scholar] [CrossRef]
  164. Tiwari, S.; Zheng, Y.; Pattinson, M.; Campo-Cossio, M.; Arnau, R.; Obregon, D.; Ansuategui, A. Approach for Autonomous Robot Navigation in Greenhouse Environment for Integrated Pest Management. In Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS), Portland, OR, USA, 20–23 April 2020; pp. 1286–1294. [Google Scholar] [CrossRef]
Figure 1. (A) An artist’s depiction of satellites orbiting Earth [73]. (B) Planet.com active and planned satellites and their characteristics [77].
Figure 2. The Yield app interface showcasing NDVI and notes of a field [80].
Figure 3. The UAV Geoscan lite (A) and multispectral camera (B) proposed by Geoscan [88].
Figure 4. Drone working route (A) and detection results of a model (B) in the AssistAgro software interface [89].
Figure 5. Tractor cabin with the Cognitive Technologies hardware (A) and visualization of the camera view from the Cognitive Technologies auto-pilot (B) [115].
Figure 6. Harvesting robot drone by Tevel [121] (A), MetoMotion tomato harvesting robot [122] (B), and strawberry-picking robot by AgroBot [123] (C).
Figure 7. (A) Novosibirsk iFarm-powered vertical farm [129] and (B) GALAD Green Line-powered farm in the Moscow restaurant Greenhouse [131].
Figure 8. The DoctorP app interface demonstrating the overarching application logic [71].
Figure 9. (A) The image captured by the tractor camera and processed by the model for identifying potato diseases. (B) A visualization highlighting the differences in a specially selected index between potatoes affected by the Y virus (top two images) and healthy potatoes (bottom two photos).
Figure 10. (A) Output of the cucumber top counter model. (B) Output of the cucumber problem detection model. (C) Evaluation of the lifting mechanism prototype. (D) Full-scale robot prototype.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
