Article

Nondestructive Detection Method for the Calcium and Nitrogen Content of Living Plants Based on Convolutional Neural Networks (CNN) Using Multispectral Images

by
Grzegorz Kunstman
1,*,
Paweł Kunstman
1,
Łukasz Lasyk
1,
Jacek Stanisław Nowak
2,
Agnieszka Stępowska
2,
Waldemar Kowalczyk
2,
Jakub Dybaś
3 and
Ewa Szczęsny-Małysiak
3
1
Active Text, 30-519 Krakow, Poland
2
The National Institute of Horticultural Research, 96-100 Skierniewice, Poland
3
Jagiellonian Centre for Experimental Therapeutics, Jagiellonian University, 30-348 Krakow, Poland
*
Author to whom correspondence should be addressed.
Agriculture 2022, 12(6), 747; https://doi.org/10.3390/agriculture12060747
Submission received: 22 March 2022 / Revised: 25 April 2022 / Accepted: 29 April 2022 / Published: 25 May 2022

Abstract

Herein, we present a novel method for determining plant nutritional state using computer vision and Neural Networks. The method is based on multispectral imaging performed by a purpose-built Agroscanner and a dedicated analytical system for further data analysis with Neural Networks. The Agroscanner is a low-cost mobile device intended for multispectral measurements at macro-scale, operating at four wavelengths: 470, 550, 640 and 850 nm. Together with the developed software and an implemented Neural Network, this made it possible to design a unique approach to processing acquired plant images and assessing plant physiological state. The novelty of the developed technology lies in the multispectral, macro-scale analysis of individual plant leaves, rather than entire fields. Such an approach makes the method highly sensitive and precise. The method presented herein determines the basic physiological deficiencies of crops with around 80% efficiency.

1. Introduction

Agriculture is an economic sector that relies on many variables that cannot be easily controlled (i.e., temperature, solar radiation, soil properties). Unfavorable growing conditions can significantly change the productivity of a field in a short time period [1]. Over the years, with the growth of large-scale agriculture, a need for a quick and reliable assessment of plant physiological state emerged.
Physiological condition reflects the yield potential of plants, which is significantly influenced by climatic conditions, soil fertilization and plant nutrition. Symptoms caused by unfavorable climatic conditions and improper fertilization are often observed in plants; these are related to impaired photosynthesis and nutrient deficiencies or excesses. Assessing the physiological state of plants mainly on the basis of visual diagnostics (observation of plants and identification of alarming symptoms) or chemical analysis of the substrate, nutrient solutions and plants is not fully effective. Accurate, scanner-based assessment of the condition of plants against their nutritional and climatic requirements will increase the efficiency of plant production advisory services by rationalizing feeding strategies and optimizing production conditions. It will thus make it possible to optimize production, and precise diagnostics enabling rational fertilization will also benefit environmental and social aspects.
Different diagnostic methods are used to assess the physiological state of plants, but all are based on chemical analysis of soil and plants and on observation of plant appearance, which significantly delays determining the cause of the observed symptoms. The subject of this research project, a multispectral method for determining the nutritional status of plants, enables a quick assessment of the physiological state of plants, allowing measurements and agrotechnical recommendations under the real cultivation conditions of a particular plant, which is extremely important for large crop areas. It therefore allows problems associated with feeding plants to be verified quickly, thereby reducing the incidence of diseases and pests.
Optimizing agricultural production with a cheap and reliable multispectral method of determining the nutritional status of plants will correct errors in fertilization and plant nutrition while reducing the use of crop protection products. It provides opportunities for precise plant fertilization and application of plant protection products, which will increase the resistance of plants to biotic (pathogens) and abiotic (climate) stress. Fast diagnosis and proper nutrition of plants will improve the quantity and quality of obtained crops and increase the nutritional value of plants, while reducing the amount of harmful substances (pesticides, nitrates, heavy metals) and therefore having a positive impact on consumer safety. It will also eliminate excessive feeding of plants, which will have a positive impact on limiting the use of fertilizers and thus reduce soil and groundwater contamination with components originating from fertilizers.
Sustainable growth and development of plants means that their metabolism takes place at the highest energy level that the plant can achieve under current environmental conditions. This means that the products of photosynthesis and carbon assimilation are efficiently distributed in the plant, which then produces active green matter and the most valuable generative organs. These interrelationships are used in crop production for maximum yield with high commercial quality. However, due to often unfavorable climatic conditions both in the field and under cover production, the natural physiological processes of the plants are disrupted and they do not achieve maximum yield and quality. In traditional plant production, the remedy is soil fertilization and plant nutrition conducted on the basis of chemical analyses of the substrate, nutrients and plants, and often only on the basis of visual diagnosis (observation of plants and identification of alarming symptoms). However, the results of chemical analyses reflect only the condition of the sample at the time of sampling. It is also not possible to assess from them what caused any disorders or how much they modified the entire plant organism. Visual diagnosis is subject to the “error of knowledge and skill” of the person making the observations. These aspects cause crops to be grown according to a “human point of view” and do not take into account the actual condition and requirements of the plants. This results in the use of excessive or ill-balanced doses of fertilizers, biostimulants or even plant protection products (susceptibility to disease infections and pest activity is often dependent on plant vigor). The improper use of such inputs is a burden on the environment and generates problems with the biological quality of plant products and even their safety for consumers.
Classical spectroscopic methods provide a single spectrum per measurement comprising overall chemical information from the whole sample volume, while hyperspectral imaging provides thousands of spectra combined into a spectral image where each pixel corresponds to one spectrum. This approach enables obtaining not only the chemical signature of a given material but also a correlation of chemical information with its precise localization within the sample [2]. The application of hyperspectral imaging varies from a micro-scale or even molecular level in biomedicine and biological samples to landscape or field scale in agriculture, geology and astronomy [2]. In agriculture, it was first used to determine physical properties such as leaf size or leaf area index [2]. Further development of this technique allowed for differentiation between various plants based, e.g., on pigment concentration for monitoring plant growth and photosynthetic productivity potential, fertilization efficiency, and finally for detection of the early onset of plant diseases [2,3].
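The pixel-per-spectrum idea can be illustrated with a minimal NumPy sketch; all shapes here are illustrative assumptions, not the dimensions of any instrument discussed below:

```python
import numpy as np

# A hyperspectral image is a cube: two spatial axes plus one spectral axis.
height, width, n_bands = 120, 160, 64
cube = np.random.default_rng(0).random((height, width, n_bands))

# Each pixel carries a full spectrum, tying chemical information to a
# precise location in the sample...
pixel_spectrum = cube[40, 80, :]
assert pixel_spectrum.shape == (n_bands,)

# ...whereas a classical single-spectrum measurement corresponds to the
# spatial average over the whole sample volume:
bulk_spectrum = cube.mean(axis=(0, 1))
assert bulk_spectrum.shape == (n_bands,)
```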
To avoid the loss of crops and gain maximum yield, different methods of remote sensing (i.e., RGB, near-infrared (NIR), far-infrared (FIR; thermal), lidar, multi- and hyper-spectral) were introduced for detection of early signs of malnutrition or disease in plants [4,5]. Near-infrared (NIR) spectroscopy has been known in agriculture since the 1960s [2,6], but since it lacks spectral range and precision, it has been combined with other techniques to become multispectral imaging [2]. Apart from NIR, broadbands detected by the multispectral sensors comprise RGB, ultraviolet light, red edge and/or thermal bands [4]. A combination of these can give voluminous information regarding plant nutritional status, and thus be helpful in crop maintenance.

1.1. Elements and Their Function

Two elements that play a vital role in plant growth and nutritional status are nitrogen (N) and calcium (Ca). An appropriate amount of available nitrogen is the key to ensuring maximum plant growth; it influences the growth rate of leaves and fruit quality. Not only does nitrogen deficiency limit plant growth more often than the lack of any other plant nutrient, but plants exposed to its excess are also unable to develop roots properly and turn brown and wither [7].
Plant growth and fruit development also strongly depend on the calcium supply. Calcium is a macronutrient required for cell elongation and cell division [8], activation of enzymes involved in biosynthesis [9], and regulation of many metabolic functions [10]. Ca deficiencies are first seen in the youngest parts of the plant (i.e., young leaves), where the reduction in growth rate is followed by deformations, and chlorosis leading to necrosis [11]. Interestingly, Ca deficiency often occurs in soils rich in this element [12]. This may happen when low temperature limits Ca transport in roots [13]. Unfortunately, these problems can only be identified at the time of the onset of symptoms, while the disease develops earlier, in a way imperceptible to the naked eye. Thus, it seems advisable to develop a remote sensing method that allows for routine check-ups of plant nutritional status at the cultivation site.

1.2. Neural Networks and Computer Vision

A Convolutional Neural Network (CNN) is a class of network most commonly applied in computer image processing. A CNN is a Deep Learning algorithm designed to automatically and adaptively learn spatial hierarchies of features. For several years, CNNs have been dominant in various computer vision systems, with excellent results in video processing, object recognition, image classification and segmentation [14].
According to the 2018 review “A review of the use of convolutional neural networks in agriculture” in The Journal of Agricultural Science, at that time only 23 research efforts employing CNNs existed for addressing various agricultural problems [15]. This shows how new the field of image analysis using artificial intelligence in agriculture is. Neural Networks and computer vision have become increasingly important for precision agriculture, which is why more and more publications on this topic have appeared in the last three years [16,17,18].

2. Methodology

The present study aimed to develop technology enabling the assessment of the physiological state and nutrition of plants based on multispectral imaging using Agroscanner and a dedicated IT analytical system for further data analysis based on Neural Networks.
To achieve this goal, we constructed an innovative low-cost mobile device, the Agroscanner, based on multispectral measurements at a macro-scale (1 pixel corresponds to less than 1 mm2 of a scanned plant). The device was tested on reference crops, divided into 5 fertilizer groups, which were used to simulate deficiencies in fertilization that may occur in real crops. The device is supported by an IT analytical system created for automatic analysis of the collected measurements. The main goal of the system was to determine the correct fertilizer group of a particular measurement. Based on that information, farmers can adjust fertilization schemes to maximize future harvests.

2.1. Agroscanner

One of the challenges of the project was to create an optical system that would allow efficient performance of multispectral measurements in field conditions. The system had to take into account different requirements:
-
Variable optical measurement conditions;
-
The ability to perform measurements in light winds;
-
Necessity of recording parameters such as temperature, humidity and GPS position;
-
Low manufacturing cost.
During the project, 4 prototypes of the optical system were created, from a prototype based on a single camera with a filter turret (used in astrophotography) to a lightweight system of 4 spectral cameras (Figure 1). During the development of the prototypes, different approaches to lighting the measurements were tested, from artificial lighting with daylight blocking to a system using daylight with advanced image normalization techniques. Each change in the optical system forced a change in the algorithms for normalization and analysis of the measurements.
Finally, the Agroscanner is a device constructed by Active Text (Krakow, Poland), designed to perform multispectral imaging with four simultaneously working spectral cameras, allowing the collection of 19 images per second. Each camera is a Basler daA1600-60um (CS-Mount) with a 1/1.8” Teledyne e2v EV76C570 CMOS image sensor providing monochrome images at 60 fps in 2 MP resolution (1600 × 1200 px). The device is designed to capture crop images on a macro-scale: one pixel corresponds to less than 1 mm2 of a real plant. This approach enabled us to build an IT analytical system based on the features of both plants and scans (e.g., soil mask-out, leaf vein detection, shape detection, etc.), which was impossible with currently used high-altitude drone crop scanning.
Each of the four cameras is equipped with a 10 nm narrow-band filter: 470, 550, 640 or 850 nm. Apart from the cameras, the Agroscanner is equipped with sensors for monitoring both external conditions and geolocation:
  • Temperature sensor—(“AirTemperature”);
  • The first light sensor—(“AmbientLight”);
  • Second light sensor—(“SurfaceAmbientLight”);
  • UV sensor—(“UV”);
  • GPS—(“Gps”);
  • Humidity sensor—(“AirHumidity”);
  • Compass—(“SurfaceCompass”);
  • Accelerometer—(“SurfaceAccelerometer”);
  • Gyroscope—(“SurfaceGyrometer”).
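As a sketch of how such readings might travel with each scan, the sensor names above can be collected into a per-measurement metadata record; the field types, units and sample values below are illustrative assumptions, not the device's actual data format:

```python
from dataclasses import dataclass

# Hypothetical per-measurement record mirroring the sensor names above;
# field types and units are illustrative assumptions.
@dataclass
class ScanMetadata:
    AirTemperature: float         # air temperature, deg C
    AmbientLight: float           # first light sensor, lux
    SurfaceAmbientLight: float    # second light sensor, lux
    UV: float                     # UV sensor reading
    Gps: tuple                    # (latitude, longitude)
    AirHumidity: float            # relative humidity, %
    SurfaceCompass: float         # heading, degrees
    SurfaceAccelerometer: tuple   # (ax, ay, az)
    SurfaceGyrometer: tuple       # (gx, gy, gz)

meta = ScanMetadata(21.5, 45000.0, 38000.0, 3.1, (51.96, 20.16),
                    62.0, 184.0, (0.0, 0.0, 9.81), (0.1, -0.2, 0.0))
```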
The whole unit is controlled by a central computer based on an Intel i5 CPU, able to perform preliminary computer vision analysis. The device is battery powered, allowing up to 60 min of operation.
The final prototype is the outcome of a four-year-long research project and has multiple versions. In most cases, the Agroscanner is used as a handheld device, but the company has also created a drone version.

2.2. Reference and Experimental Crops

The crops were chosen based on their properties, regarding their specific demand for elements and fertilizers and their growth period. The project was focused on cauliflower (Brassica oleracea L. var. botrytis L.)—varieties Gohan F1, Cabral F1 and David F1.
One of the main parts of the project was reference crops grown under specific conditions designed to simulate defined deficiencies in fertilization that may occur in real crops. The following fertilizer groups were defined:
  • Standard medium (compliant with the nutritional requirements of cauliflower seedlings) with the following macronutrient content: N-NO3 114 mg/L, P 44 mg/L, K 230 mg/L, Mg 49 mg/L, Ca 126 mg/L, Cl 13 mg/L and S-SO4 102 mg/L, plus the full spectrum of micronutrients;
  • medium with an increased amount of N, 186 mg/L (including N-NO3 160 mg/L and N-NH4 26 mg/L);
  • medium with a reduced amount of N, 60 mg/L, and an increased amount of Ca, 208 mg/L;
  • medium with a reduced amount of Ca, 88 mg/L.
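The varied N and Ca levels above can be encoded as data for quick comparison against the standard medium. The helper below is illustrative only, and the N content of the reduced-Ca medium is assumed to equal the standard:

```python
# N and Ca content (mg/L) of the four nutrient media described above.
# Assumption: the reduced-Ca medium keeps the standard N level.
STANDARD = {"N": 114, "Ca": 126}
MEDIA = {
    "standard":      {"N": 114, "Ca": 126},
    "high_N":        {"N": 186, "Ca": 126},
    "low_N_high_Ca": {"N": 60,  "Ca": 208},
    "low_Ca":        {"N": 114, "Ca": 88},
}

def deviation(medium):
    """Relative deviation of each element from the standard medium."""
    return {el: (MEDIA[medium][el] - STANDARD[el]) / STANDARD[el]
            for el in STANDARD}

# N is reduced by ~47% and Ca raised by ~65% in the low-N/high-Ca medium.
print(deviation("low_N_high_Ca"))
```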
Crops were monitored by agronomists during the multispectral measurements with the Agroscanner. Additionally, both soil and leaf chemical analyses were conducted simultaneously. Over the course of the project, cauliflower was cultivated in phytotrons and greenhouses located at the Research Institute of Horticulture, Skierniewice, Poland, and in arable fields scattered over three geographically distant locations: the community of Igolomia-Wawrzenczyce, Malopolska Voivodeship, Poland; the community of Pacanow, Swietokrzyskie Voivodeship, Poland; and the community of Obrazow, Swietokrzyskie Voivodeship, Poland. The arable fields enabled us to develop normalization and image morphing algorithms under various weather conditions. The greenhouse crops, on the other hand, grown under much more closely supervised conditions, were used to build the final analytical system.
In greenhouses, seedlings were grown in a phytotron (model 730 DD INOX from Biosell) allowing precise control of temperature (day: +10 °C to +40 °C; night: +3 °C to +40 °C), humidity (30% to 90% RH) and light intensity (LED lamps). Cauliflower seeds were sown into mini rockwool cubes (AO block) soaked in water and placed in vegetation chambers (phytotrons) under optimal climatic conditions (humidity 70–75%, temperature 22 °C, light period 12 h). After germination, seedlings were placed in rockwool seedling cubes (10 × 10 × 7.5 cm) soaked in standard nutrient solution and re-inserted into the vegetation chambers, with the temperature gradually changed to that appropriate for the given growth stage, i.e., from 20 °C after pricking out to 16 °C at the stage of at least 4 leaves. After rooting in the cubes, each group was fed with a nutrient solution of specific N and Ca content. Plants were fertigated from below.
Observations and measurements were made at intervals of 2–3 and 7 days, starting from the stage of 2 proper leaves (i.e., about a week after the introduction of differentiated nutrient solution) until obtaining seedlings that were ready for planting (4 weeks). Next, ready seedlings were transferred to the greenhouse for further cultivation.

2.3. Crop Monitoring

Observations and measurements can be divided into two categories:
Ongoing monitoring process—observations made every 2–3 and every 7 days, starting with the appearance of two true leaves (ca. one week after the introduction of diversified medium) until the seedling was ready to be planted (ca. 4 weeks).
Monitoring during Agroscanner measurement—soil and leaf samples were taken during Agroscanner measurements for subsequent chemical analysis.

2.4. IT Analytical System

Before each series of measurements, the device was calibrated to check the operation of all cameras, image sharpness, the relative position of images recorded by different cameras and the operation of the additional sensors. The calibration page was printed with a carbon-based toner, so the images look almost the same at all wavelengths, which makes them much easier to analyze and calibrate. Figure 2 shows the images captured by all four cameras. Because the photos from the four cameras are mutually translated, calibration images were recorded to enable analysis of the acquired data.
Each plant was photographed from different perspectives, and the total number of images of one object equaled 30, each of them comprising 4 corresponding pictures from each camera (captured at 470, 550, 640 and 850 nm wavelength). After measurement completion, recorded data were automatically sent to Active Text servers.
Each measurement consists of 4 greyscale images (one for each wavelength) of 1600 × 1200 pixels (Figure 3).
Each multispectral measurement was processed using the following algorithm:
Cropping images: The Active Text Agroscanner takes 4 spectral images at once using 4 cameras. Each image is shifted relative to the others, so before proceeding with the next steps a common part of the images must be designated.
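One standard way to designate that common part is to first estimate each camera's offset against a reference camera. The phase-correlation sketch below recovers a pure integer translation between two greyscale frames; this is a simplification, since the real inter-band alignment also has to cope with perspective and band differences, as noted in the morphing step:

```python
import numpy as np

def estimate_shift(ref, moved):
    """Estimate the integer (dy, dx) translation of `moved` relative to
    `ref` via phase correlation (FFT cross-power spectrum)."""
    F = np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak positions to signed shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(1)
ref = rng.random((128, 128))
moved = np.roll(ref, shift=(5, -3), axis=(0, 1))  # simulate a camera offset
print(estimate_shift(ref, moved))  # → (5, -3)
```

Once the offsets are known, the overlap common to all four frames can be cropped out.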
Image quality assurance: The Agroscanner has two implementations, handheld and drone-mounted. Both are used while in (limited) motion. Images were rejected if they were captured out of focus, too dark or too bright (light conditions changed during an image series, as light calibration was performed only at the beginning of a measurement series), or if they contained the shadow of the drone or the Agroscanner operator.
Three-dimensional image morphing: The placement of the cameras makes the perspective of each image slightly shifted. Further analysis requires the perspective to be synchronized within every multispectral measurement, making this a very important step. The morphing calculations were complicated because each band is different (especially the IR band), so classic photogrammetry algorithms do not apply here.
Soil recognition: Using multiple techniques (also using Neural Network models), soil was detected in images and cut out from further processing.
Leaf and vein recognition: Using multiple techniques (also using Neural Network models), leaf veins were detected. Some physiological indexes were calculated for different plant areas (e.g., vein).
Image normalization: Two normalization methods were created during the project. One was to multiply the measurements by a factor calculated from the ratio of the average measurement value in the vein area to the rest of the leaf. The other was to use a calibration plate and adjust the parameter measurements while taking a series of images. The design of the calibration plate as well as the scanner itself allows for real-time measurement analysis of lighting conditions (Figure 4).
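One reading of the first normalization method, sketched below, treats the vein area as an internal reference and rescales a band so its vein-area mean hits a fixed value, making scans taken under different lighting comparable. The masks and the reference level of 1.0 are illustrative assumptions; in the real pipeline the masks come from the leaf and vein detectors:

```python
import numpy as np

def normalize_band(band, vein_mask, reference=1.0):
    """Rescale a spectral band so the mean over the vein area equals
    `reference`, using the veins as an internal lighting standard."""
    factor = reference / band[vein_mask].mean()
    return band * factor

rng = np.random.default_rng(2)
band = 0.3 * rng.random((100, 100))       # a dim (under-lit) scan
vein = np.zeros((100, 100), dtype=bool)
vein[45:55, :] = True                     # synthetic vein mask
normalized = normalize_band(band, vein)
# After normalization the vein-area mean sits at the reference level,
# regardless of the original illumination.
assert abs(normalized[vein].mean() - 1.0) < 1e-9
```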
Physiological index calculation: For each pixel in the synchronized, 3D-morphed and normalized multispectral set we calculated a set of 18 physiological indexes.
  • ATIndexATLeaf—Active Text normalized index based on leaf detection algorithm;
  • ATIndexATVein—Active Text normalized index based on vein detection algorithm;
  • ATIndexCI_RE—Chlorophyll index—red edge;
  • ATIndexCI_G—Chlorophyll index–green;
  • ATIndexCVI—Chlorophyll vegetation index;
  • ATIndexEVI—Enhanced vegetation index;
  • ATIndexGLI—Green leaf index;
  • ATIndexgNDVI—Green normalized difference vegetation index;
  • ATIndexMSAVI—MSAVI index;
  • ATIndexMTVI2—Second modified triangular vegetation index;
  • ATIndexNGRDI—Normalized green–red difference index;
  • ATIndexOSAVI—Optimized soil adjusted vegetation index;
  • ATIndexSAVI—Soil adjusted vegetation index;
  • ATIndexTVI—Triangular vegetation index;
  • ATIndexVARI—Visible atmospherically resistant index;
  • ATIndiciesNDVI—NDVI index;
  • ATIndiciesSIMPLE—SIMPLE index;
  • ATIndexATLeafVeinRatio—Active Text normalized index-based vein to leaf index ratio.
Each pixel in the multispectral set (with spectral images) was transformed by calculation of indexes into an 18-dimensional vector.
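Several indexes in the list have standard published formulas, so the per-pixel vector can be sketched from the four bands as below. This computes only a subset; the Active Text ATIndex* definitions and the full 18-index stack are not reproduced here:

```python
import numpy as np

def index_stack(blue, green, red, nir, eps=1e-9):
    """Per-pixel index vector from normalized 470/550/640/850 nm bands,
    using standard published formulas for a subset of the indexes above."""
    return np.stack([
        (nir - red) / (nir + red + eps),                            # NDVI
        (nir - green) / (nir + green + eps),                        # green NDVI
        (green - red) / (green + red + eps),                        # NGRDI
        (green - red) / (green + red - blue + eps),                 # VARI
        (2 * green - red - blue) / (2 * green + red + blue + eps),  # GLI
    ], axis=-1)

rng = np.random.default_rng(3)
blue, green, red, nir = (rng.random((1200, 1600)) for _ in range(4))
vectors = index_stack(blue, green, red, nir)
assert vectors.shape == (1200, 1600, 5)  # one index vector per pixel
```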
  • Physiological Index Analysis: The final goal is to determine which fertilizer group a measured plant belongs to. Therefore, multiple classifiers were built. Each classifier takes the 18-dimensional index image as input and outputs a fertilizer group. During the whole project we tested the following methods:
    • Statistical methods: Based on data sets gathered during the project we calculated focus points for every fertilizer group. The created metric can define the nearest focus point for each new vector, hence the fertilizer group.
    • Neural Network method: We tested multiple models—custom-built dense Neural Network model, KNN Classification and CNN model.
A synthetic version of the algorithm is shown in Figure 5.

3. Results

As part of the project, approx. 30,000 multispectral measurements were performed (more than 120,000 single images). During three years of measurements, the Research and Development Team worked on the Agroscanner hardware and on statistical and analytical algorithms. The final measurement batch was used to test the last version of the system. The main research task was to identify the correct fertilizer group of plants scanned with the Agroscanner. Two approaches were used:
Statistical approach—based on defining a proper subset of calculated indexes and metrics to be able to define the focal point for each fertilizer group.
AI approach—based on classifying Convolutional Neural Networks.

3.1. Statistical Approach

The statistical approach looks for cluster points in a multidimensional space in which each measurement corresponds to an averaged vector of indices. Averaging of indices for a measurement takes place only over the green leaf surface (other artifacts of the measurement, such as soil, are rejected). For each fertilizer group, we looked for a focal point such that as many measurements as possible lie in its proximity. This can be visualized on a two-dimensional plane using t-SNE (t-Distributed Stochastic Neighbor Embedding), which reduces each vector to two dimensions. The blue points labeled zero represent samples with nitrogen deficiency, while the orange points labeled three represent samples without nitrogen deficiency (Figure 6).
In both cases, there are clear focal points for the two groups. Nevertheless, in both cases, some measurements lie in the wrong neighborhood.
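The focal-point classification itself can be sketched as a nearest-centroid rule. The data below are synthetic and their group geometry is invented for illustration; the real vectors are the 18-dimensional averaged indexes described above:

```python
import numpy as np

rng = np.random.default_rng(4)
dim, per_group, n_groups = 18, 50, 5

# Synthetic, well-separated groups standing in for real index vectors.
centers = rng.normal(scale=3.0, size=(n_groups, dim))
X = np.vstack([centers[g] + rng.normal(size=(per_group, dim))
               for g in range(n_groups)])
y = np.repeat(np.arange(n_groups), per_group)

# "Focal point" = per-group mean of the training vectors.
focus = np.stack([X[y == g].mean(axis=0) for g in range(n_groups)])

def classify(vector):
    """Assign a vector to the fertilizer group of the nearest focal point."""
    return int(np.argmin(np.linalg.norm(focus - vector, axis=1)))

accuracy = np.mean([classify(v) == g for v, g in zip(X, y)])
```

Measurements that land nearer a wrong focal point, as in Figure 6, are exactly the ones this rule misclassifies.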
We tested multiple statistical approaches on the cluster vector set (K-Nearest Neighbors (KNN), ExtraTreesClassifier, etc.). The best results were obtained with a dense four-layer Neural Network classifier, with a recognition quality of 71%.

3.2. AI Approach

To achieve the best recognition rate for the fertilizer group, a proper model had to be used. To this end, we set up an extensive Machine Learning process. Using an expanded cluster of multiple AI machines, we were able to choose the best CNN model, testing multiple model designs in the process (test results in Table 1). The selected CNN classifier has a total of 14 layers, 3 of them convolutional, and a total of 8,978,181 neurons. As input it takes a 280 × 280 pixel image, and as output it produces one of 5 fertilizer groups.
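Since the exact layer configuration is not published beyond these counts, the sketch below only traces feature-map shapes through an assumed stack of three valid convolutions with 2 × 2 max pooling for a 280 × 280 input, to show how the spatial dimensions funnel down before the dense layers; kernel sizes and filter counts are assumptions:

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Output size of a convolution or pooling layer along one axis."""
    return (size + 2 * pad - kernel) // stride + 1

size, channels = 280, 1                                # greyscale input
for kernel, n_filters in [(5, 16), (3, 32), (3, 64)]:  # assumed conv layers
    size = conv_out(size, kernel)                      # valid convolution
    size = conv_out(size, 2, stride=2)                 # 2 x 2 max pooling
    channels = n_filters

flat = size * size * channels        # features entering the dense layers
print(size, channels, flat)          # → 33 64 69696
```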

3.3. Final Test

The final test was conducted on the greenhouse-grown cauliflower crops. Measurements were conducted with the Agroscanner on fully grown plants at intervals of 5 to 14 days. On each measurement day, a sample of the green parts of each fertilizer class was taken for chemical analysis. During the test, the previously trained final model showed a recognition accuracy of 86.93% for a single recognition sample (Table 2).
To increase the recognition ratio, we can assume that plants in neighboring measurements belong to the same fertilizer group. We predict that with such an assumption the outcome can be increased by 5%. However, because the reference crop fields were too small, the calculations necessary to check the accuracy of this procedure were not possible.
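A back-of-the-envelope check of this neighbor-voting idea: if each single recognition is correct with probability p and neighboring plants do share a fertilizer group, a majority vote over k independent recognitions is correct with the binomial tail probability below. Independence is an assumption; real neighboring scans are correlated, so the true gain would be smaller:

```python
from math import comb

def majority_accuracy(p, k):
    """Probability that a majority of k independent recognitions, each
    correct with probability p, is correct (use odd k to avoid ties)."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i)
               for i in range(k // 2 + 1, k + 1))

p = 0.8693                                 # single-scan accuracy, final test
print(round(majority_accuracy(p, 3), 4))   # roughly 0.95 for 3 votes
```

Under these idealized assumptions, three agreeing scans already lift accuracy into the mid-90s, consistent in spirit with the predicted gain of around 5%.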

4. Discussion

The unique construction of the Agroscanner allows measurements of plant tissues based on light transmittance and absorbance (multispectral measurements) with a high-accuracy assessment of their physiological condition. The novelty of the technology developed by Active Text in the course of R&D works is based on multispectral, macro-scale analysis [1] of individual plant leaves, rather than entire fields (even when scanning from the air), which makes the method much more sensitive and precise.
One of the biggest challenges for measurements performed on an arable field is the calibration of measurements. It enables a comparative analysis between values of indices calculated from the measurements collected at different periods, times of the day or in different locations. The lack of universal calibration methods is one of the most important barriers to the development of multispectral measurements in agriculture. Therefore, the measurements and their automatic calibration performed in the fields were so important for the project. As a result, a procedure for performing measurements and a calibration method that allows for leveling the measuring factor related to the variability of external measurement conditions (in particular lighting conditions) were developed. This method consists of the following elements: appropriate design of the optical module that allows recording of spectral images with high sensitivity and speed (key for measurements taken from a short distance), an analytical system based on computer image analysis that detects leaf elements, and a system for physiological comparative analysis of leaf element indices.
Two analytical approaches were tested: a statistical method and an AI-based method. The statistical approach to analyze indexes in a comparative way within the elements of a particular leaf of a given plant is innovative. This approach has not yet been used in the agricultural industry because it reverses the way of thinking about indices. Instead of focusing on calibrating spectral images to enable their comparison, we constructed new indices based on a comparison of the values measured within one measurement. Our research has shown that examining the ratio of indices calculated on the area of veins and the rest of the leaf gives important information about the physiological state of the plant. The AI-based approach is particularly innovative because it does not focus on calculating indexes, but leaves whole calculations to the Deep Neural Network.
Considering that the method is intended for large-scale field crops, it is important that the analytical algorithm can cope with very different lighting conditions. Our research has shown that the approach using Convolutional Neural Networks has a higher recognition quality. This is not surprising, since Convolutional Neural Networks can learn more complex relationships than computed indices. These networks may pay attention not only to the level of light transmittance and absorbance (in terms of value) but also to more subtle features such as leaf or vein shape. On the other hand, the Neural Network approach carries all the problems associated with the Machine Learning process: it requires a large, statistically representative training base, and debugging is very difficult. Nevertheless, data are the key.

5. Conclusions

The conducted research allowed the development of a technology that, with around 80% efficiency, determines the basic physiological deficiencies of crops. Such knowledge allows the use of precision farming on a much larger scale. What is more, because the constructed Agroscanner is based on relatively cheap components, the cost of using the technology itself is small compared to the benefits.
In future, this macro multispectral analysis will allow automatic detection of lesions, parasites, pests and other stress factors at an early stage of their occurrence, which grants the possibility to remove the causes of plant stress when it is still possible.

Author Contributions

Conceptualization: G.K., P.K., Ł.L., J.S.N., A.S.; Methodology: G.K., J.S.N., A.S., W.K.; Software: G.K., P.K., Ł.L.; Writing—original draft preparation: G.K., P.K., J.S.N., A.S., J.D., E.S.-M. All authors have read and agreed to the published version of the manuscript.

Funding

The project was co-financed by the Polish National Centre for Research and Development under the program “Environment, Agriculture and Forestry”, grant No: BIOSTRATEG2/298549/6/NCBR/2016, and by the European Regional Development Fund under the Regional Operational Programme for the Lubelskie Voivodeship 2014–2020, grant No: RPLU.01.02.00-06-0001/18-00.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The following authors have affiliations with organizations with direct or indirect financial interest in the subject matter discussed in the manuscript: G.K.—co-owner of company Active Text Sp. z o.o.

Figure 1. Agroscanner generations created during the research project.
Figure 2. Agroscanner camera geometry calibration on a carbon-based (laser printout) calibration plate. The carbon-based plate produces a similar image at every measured wavelength, across all cameras.
Figure 3. Agroscanner sample measurement from 4 spectral cameras—each with a resolution of 1400 × 1200 pixels; wavelengths: 470, 550, 640 and 850 nm.
Figure 4. Image normalization using calibration plate.
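The normalization step of Figure 4 can be sketched as follows — a minimal illustration assuming the calibration plate serves as a per-band reference for illumination correction (the function name and exact procedure are hypothetical, not taken from the paper):

```python
import numpy as np

def normalize_band(band: np.ndarray, plate_region: np.ndarray) -> np.ndarray:
    """Scale one spectral band so the calibration plate's mean intensity maps to 1.0.

    Dividing every pixel by the plate's mean response cancels differences in
    illumination and camera gain between the four spectral cameras.
    """
    plate_mean = float(np.mean(plate_region))
    if plate_mean <= 0:
        raise ValueError("calibration plate region must have positive mean intensity")
    return band.astype(float) / plate_mean
```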
Figure 5. Multispectral measurement analytical algorithm.
Figure 6. Sample statistical clustering of a series of measurements. Each dot represents one Agroscanner measurement. The 16-dimensional index vector of each measurement was reduced to 2-dimensional space using the T-Distributed Stochastic Neighbor Embedding (t-SNE) algorithm. Red and green areas represent two clusters corresponding to two different fertilizer groups.
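The dimensionality reduction behind Figure 6 can be reproduced in outline with scikit-learn — a sketch on synthetic data (the real input is the 16-dimensional index vector per Agroscanner measurement; the hyperparameters here are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.manifold import TSNE

# Synthetic stand-in for 40 measurements x 16 spectral indices.
rng = np.random.default_rng(0)
X = rng.random((40, 16))

# Reduce to 2-D for cluster visualization, as in Figure 6.
embedding = TSNE(n_components=2, perplexity=5, init="random",
                 random_state=0).fit_transform(X)
```

In the paper, each embedded point is then colored by its fertilizer group; well-separated clusters indicate that the index vectors carry the nutritional signal.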
Table 1. Testing of different Neural Network models with corresponding recognition ratio.

| # | Model Id | Recognition [%] | Number of Neurons | Model Type |
|---|----------|-----------------|-------------------|------------|
| 1 | model-cnn-raw-109-1-100-0.83-0.86.hdf5 | 86.15 | 8,978,181 | CNN_CLASS |
| 2 | model-cnn-raw-103-1-97-0.66-0.77.hdf5 | 76.62 | 7,134,597 | CNN_CLASS |
| 3 | model-cnn-morph-01-211-10-0.97-0.97.hdf5 | 67.05 | 448,693 | CNN_CLASS |
| 4 | model-cnn-morph-01-343-08-0.95-0.98.hdf5 | 65.09 | 230,597 | CNN_CLASS |
| 5 | model-cnn-morph-01-319-10-0.96-0.97.hdf5 | 64.34 | 184,485 | CNN_CLASS |
| 6 | model-cnn-morph-01-121-09-0.95-0.98.hdf5 | 63.81 | 74,293 | CNN_CLASS |
| 7 | model-cnn-morph-01-207-10-0.99-1.00.hdf5 | 63.26 | 816,009 | CNN_CLASS |
| 8 | model-cnn-morph-01-176-07-0.95-0.98.hdf5 | 63.22 | 84,933 | CNN_CLASS |
| 9 | model-cnn-raw-01-26-19-0.97-0.98.hdf5 | 62.88 | 245,269 | CNN_CLASS |
| 10 | model-cnn-raw-01-109-15-0.97-0.98.hdf5 | 62.76 | 67,365 | CNN_CLASS |
Table 2. Fertilizer group recognition broken down by measurement day and analytical method. ‘MATCHED AI’—AI approach; ‘MATCHED STAT’—a statistical approach. N is given in % of dry matter (DM); P, K, Ca and Mg in mg·kg⁻¹ DM. All samples are middle leaves.

| Class | Matched AI | Matched stat | Samples | N | P | K | Ca | Mg |
|-------|-----------|--------------|---------|---|---|---|----|----|
| Measurement day: 45 | 94.23% | 74.42% | 1421 | | | | | |
| Variant A—full fertilization N + P + K | 94.25% | 78.89% | 348 | 2.27 | 3810 | 11,000 | 20,100 | 2720 |
| Variant B—fertilization N + P (without K) | 95.57% | 68.57% | 203 | 2.78 | 5640 | 11,400 | 24,800 | 3260 |
| Variant C—fertilization N + K (without P) | 94.64% | 66.67% | 261 | 2.60 | 2530 | 7370 | 23,600 | 2980 |
| Variant D—fertilization K + P (without N) | 94.14% | 74.00% | 290 | 1.81 | 4210 | 13,600 | 15,900 | 2210 |
| Variant E—without fertilization | 93.10% | 81.18% | 319 | 1.47 | 2980 | 5950 | 13,600 | 1960 |
| Measurement day: 58 | 76.60% | 59.22% | 406 | | | | | |
| Variant A—full fertilization N + P + K | 65.52% | 57.78% | 87 | 1.76 | 2970 | 8820 | 17,200 | 2780 |
| Variant B—fertilization N + P (without K) | 77.01% | 41.11% | 87 | 2.20 | 5000 | 7430 | 20,200 | 3040 |
| Variant C—fertilization N + K (without P) | 84.48% | 72.39% | 116 | 2.68 | 1960 | 6040 | 24,400 | 3960 |
| Variant D—fertilization K + P (without N) | 72.41% | 59.18% | 58 | 1.17 | 3090 | 9490 | 15,200 | 2100 |
| Variant E—without fertilization | 81.03% | 65.52% | 58 | 1.00 | 1820 | 6090 | 11,200 | 1850 |
| Measurement day: 65 | 82.51% | 80.71% | 406 | | | | | |
| Variant A—standard medium N + P + K | 83.91% | 80.00% | 87 | 1.95 | 3350 | 7160 | 25,900 | 3460 |
| Variant B—fertilization N + P (without K) | 81.61% | 87.78% | 87 | 1.83 | 4640 | 4960 | 24,000 | 3180 |
| Variant C—fertilization N + K (without P) | 81.03% | 73.33% | 58 | 2.24 | 1340 | 5180 | 26,600 | 4290 |
| Variant D—fertilization K + P (without N) | 82.76% | 68.89% | 87 | 1.05 | 2400 | 9660 | 16,600 | 2100 |
| Variant E—without fertilization | 82.76% | 91.11% | 87 | 0.91 | 1480 | 5870 | 8550 | 2010 |
| Measurement day: 71 | 85.15% | 69.33% | 377 | | | | | |
| Variant A—full fertilization N + P + K | 89.66% | 68.97% | 58 | 1.51 | 2580 | 5860 | 23,000 | 3170 |
| Variant B—fertilization N + P (without K) | 89.66% | 87.78% | 87 | 2.01 | 4940 | 7160 | 25,400 | 3800 |
| Variant C—fertilization N + K (without P) | 74.14% | 28.33% | 58 | 2.18 | 1050 | 4910 | 24,800 | 4120 |
| Variant D—fertilization K + P (without N) | 83.91% | 62.22% | 87 | 0.94 | 2130 | 11,200 | 17,300 | 2130 |
| Variant E—without fertilization | 86.21% | 85.56% | 87 | 0.88 | 1230 | 6870 | 14,900 | 2330 |
| Measurement day: 77 | 84.08% | 56.92% | 377 | | | | | |
| Variant A—full fertilization N + P + K | 77.59% | 43.33% | 58 | 1.53 | 2590 | 5890 | 23,600 | 3110 |
| Variant B—fertilization N + P (without K) | 88.51% | 55.56% | 87 | 2.07 | 5160 | 7920 | 26,400 | 4290 |
| Variant C—fertilization N + K (without P) | 79.31% | 73.33% | 58 | 2.48 | 1130 | 6940 | 31,000 | 5740 |
| Variant D—fertilization K + P (without N) | 86.21% | 43.33% | 87 | 0.8 | 1980 | 12,500 | 21,300 | 2640 |
| Variant E—without fertilization | 85.06% | 70.00% | 87 | 0.96 | 1320 | 7050 | 19,100 | 2910 |
| Measurement day: 90 | 84.62% | 73.85% | 377 | | | | | |
| Variant A—full fertilization N + P + K | 77.01% | 73.33% | 87 | 1.5 | 2670 | 12,500 | 29,900 | 5280 |
| Variant B—fertilization N + P (without K) | 90.80% | 70.00% | 87 | 2.4 | 6220 | 13,300 | 29,100 | 5210 |
| Variant C—fertilization N + K (without P) | 77.59% | 60.00% | 58 | 2.35 | 1190 | 8040 | 29,700 | 5950 |
| Variant D—fertilization K + P (without N) | 87.93% | 85.00% | 58 | 1.01 | 2450 | 24,000 | 29,000 | 4250 |
| Variant E—without fertilization | 88.51% | 80.00% | 87 | 0.87 | 1300 | 9960 | 22,200 | 3810 |
| Measurement day: 95 | 81.90% | 78.06% | 348 | | | | | |
| Variant A—full fertilization N + P + K | 77.01% | 81.11% | 87 | 1.29 | 2300 | 8950 | 30,900 | 5000 |
| Variant B—fertilization N + P (without K) | 81.61% | 81.11% | 87 | 2.04 | 6180 | 7840 | 35,400 | 6050 |
| Variant C—fertilization N + K (without P) | 63.79% | 45.00% | 58 | 1.85 | 728 | 4780 | 35,100 | 6160 |
| Variant D—fertilization K + P (without N) | 94.83% | 98.33% | 58 | 0.83 | 1920 | 16,400 | 26,500 | 3570 |
| Variant E—without fertilization | 94.83% | 81.67% | 58 | 0.83 | 1040 | 6210 | 23,800 | 3570 |
| TOTAL | 86.93% | 71.47% | 3712 | | | | | |