Smart Farming: Cutting-Edge Technologies and Robots Used in Horticultural Crops

A special issue of Horticulturae (ISSN 2311-7524).

Deadline for manuscript submissions: closed (31 December 2023)

Special Issue Editors

Guest Editor
China National Engineering Research Center for Information Technology in Agriculture (NERCITA), Haidian District, Beijing, China
Interests: agricultural robotics

Guest Editor
College of Science, University of Lincoln, Lincoln LN6 7TS, UK
Interests: machine learning; hyperspectral imaging; agricultural engineering

Guest Editor
Faculty of Science and Technology, Norwegian University of Life Sciences, 1433 Ås, Norway
Interests: modeling and control; visual servoing; agricultural robotics; precision agriculture

Special Issue Information

Dear Colleagues,

According to a United Nations report, the world will need 70% more food, as measured by calories, to feed a global population of 9.6 billion in 2050. This will require farmers to produce more food from the same land, more precisely and more efficiently. Horticultural production is among the most labor-intensive agricultural systems. Smart farming and advanced robotics are likely to address these challenges over the coming decades by helping to optimize plant care and by reducing the time humans spend on repetitive, time-consuming farming operations.

This Special Issue, “Smart Farming: Cutting-Edge Technologies and Robots Used in Horticultural Crops”, aims to present state-of-the-art smart farming and robotics techniques for improving the productivity of fruits, vegetables, and ornamental plants. Topics of interest include (but are not limited to):

  • End-effectors, manipulators and platforms for breeding, seeding, crop protection, and harvesting
  • Path-planning, control strategies and robotic manipulation for agricultural robots
  • Autonomy and navigation in farming environments
  • Fruit, vegetable and flower detection, tracking and localization
  • Horticultural crop yield estimation/grading/monitoring/mapping
  • Pest and disease detection of horticultural crops
  • Automatic or robotic phenotyping for crop breeding

Dr. Ya Xiong
Dr. Junfeng Gao
Dr. Yunchao Tang
Dr. Antonio Candea Leite
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Horticulturae is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • agricultural robotics
  • sensing and perception
  • end-effectors and manipulation
  • navigation and autonomy
  • path planning

Published Papers (1 paper)


Research

Article
Grape-Bunch Identification and Location of Picking Points on Occluded Fruit Axis Based on YOLOv5-GAP
by Tao Zhang, Fengyun Wu, Mei Wang, Zhaoyi Chen, Lanyun Li and Xiangjun Zou
Horticulturae 2023, 9(4), 498; https://doi.org/10.3390/horticulturae9040498 - 16 Apr 2023
Cited by 5
Abstract
Due to the short fruit axis, abundant leaves, and complex backgrounds of grapes, most grape cluster axes are blocked from view, which makes robot positioning during harvesting more difficult. This study discussed a method for locating picking points under partial occlusion and proposed a grape cluster-detection algorithm, YOLOv5-GAP, based on YOLOv5. First, the Conv layer in the first layer of the YOLOv5 backbone was replaced with a Focus layer; a convolutional attention operation was then applied to the first three C3 structures, the C3 structure layer was modified, and a Transformer was used in the Bottleneck module of the last C3 structure to reduce computation and better extract global feature information. Second, on the basis of bidirectional feature fusion, skip connections were added and variable weights were used to strengthen the fusion of feature information across resolutions. Then, an adaptive activation function was used to learn whether neurons should be activated, realizing dynamic control of the network's degree of nonlinearity. Finally, a combination of digital image processing and mathematical geometry was used to segment the grape bunches identified by YOLOv5-GAP, and picking points were determined from the centroid coordinates. Experimental results showed that the average precision of YOLOv5-GAP was 95.13%, which was 16.13%, 4.34%, and 2.35% higher than that of the YOLOv4, YOLOv5, and YOLOv7 algorithms, respectively. The average positioning error of the picking point was 6.3 pixels, verifying that the algorithm detects grapes quickly and accurately.
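
The abstract describes "variable weights" for the bidirectional feature fusion. A minimal PyTorch sketch of that general idea follows, assuming a BiFPN-style fast normalized fusion; the class name `WeightedFusion` and all implementation details are illustrative assumptions, not the layer from the paper:

```python
# Hypothetical sketch of variable-weight feature fusion (BiFPN-style
# "fast normalized fusion"); NOT the exact layer used in YOLOv5-GAP.
import torch
import torch.nn as nn

class WeightedFusion(nn.Module):
    """Fuse same-shaped feature maps with learnable, normalized weights."""

    def __init__(self, n_inputs: int, eps: float = 1e-4):
        super().__init__()
        self.weights = nn.Parameter(torch.ones(n_inputs))  # one weight per input
        self.eps = eps

    def forward(self, feats: list[torch.Tensor]) -> torch.Tensor:
        w = torch.relu(self.weights)      # keep the weights non-negative
        w = w / (w.sum() + self.eps)      # normalize so the weights sum to ~1
        return sum(wi * f for wi, f in zip(w, feats))
```

Each incoming feature map (e.g., one from a skip connection and one from the top-down path) gets a learned, non-negative weight, so the network can decide how much each resolution contributes to the fused output.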
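
The final step, locating a picking point from the segmented bunch and its centroid, can likewise be sketched. The OpenCV code below shows one plausible reading of that pipeline under stated assumptions: the HSV threshold, the 10-pixel offset, and the function name `locate_picking_point` are illustrative, not the authors' actual procedure:

```python
# Hypothetical sketch of centroid-based picking-point localization;
# NOT the exact digital-image-processing pipeline from the paper.
import cv2
import numpy as np

def locate_picking_point(bgr_crop: np.ndarray) -> tuple[int, int]:
    """Estimate (x, y) of a picking point inside a YOLO-detected bunch crop."""
    hsv = cv2.cvtColor(bgr_crop, cv2.COLOR_BGR2HSV)
    # Assumed HSV range for dark grapes; a real system would tune this or
    # use a learned segmentation model instead of a fixed threshold.
    mask = cv2.inRange(hsv, (100, 40, 20), (170, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Centroid of the segmented bunch from image moments.
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        raise ValueError("no bunch pixels found in crop")
    cx = int(m["m10"] / m["m00"])

    # Geometric assumption: the fruit axis lies on the vertical line through
    # the centroid, so the picking point sits just above the topmost bunch
    # pixel in the centroid's column.
    column = np.flatnonzero(mask[:, cx])
    top_y = int(column.min()) if column.size else int(m["m01"] / m["m00"])
    return cx, max(top_y - 10, 0)  # 10 px above the bunch, clipped to the image
```

The geometric assumption, that the fruit axis lies on the vertical line through the centroid, is what allows a picking point to be placed even when the axis itself is hidden by leaves.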
