Deep Learning YOLO-Based Solution for Grape Bunch Detection and Assessment of Biophysical Lesions
Round 1
Reviewer 1 Report
Check the attachment.
Comments for author File: Comments.pdf
Author Response
Please see the attachment.
Author Response File: Author Response.pdf
Reviewer 2 Report
This paper focuses on grape bunch detection and damage classification using transfer-learned YOLO-based deep learning approaches. Overall, it is an interesting topic for a general audience, and the paper is written with good flow. The results are promising and the models are applicable to general usage. That said, there are still some minor comments concerning the data and results.

First of all, since one of the main advantages of YOLO is its efficiency and real-time inference, how much speed do the three models gain or sacrifice compared with state-of-the-art models?

Besides, although the authors showcase a few failure cases of the model, the analysis of the limitations of model performance is still inadequate. For some scenarios, such as occlusion and densely arranged objects, that are common in practical applications but may be underrepresented in public datasets, how well can these three models handle them? Is there any specific type of complexity on which any of the three models tends to fail? At what degree of occlusion or overlap does the model start to fail? A further evaluation of the results is therefore recommended.

Moreover, how well would these models work on images taken with a different sensor, under different illumination conditions, and at different shooting distances and angles? Since most of the images came from the same phone camera, did the models overfit to this type of data source?

Also, with regard to some of the difficulties discussed by the authors, incorporating spectral information such as NIR or red-edge bands, if possible, may also be helpful, and could remain useful in applications considering the wide usage of NIR spectrometry. A more extended discussion of the models' applications would also be beneficial.

Finally, the color of the bounding boxes in some figures is hard to read; a more contrasting color is suggested.
Author Response
Please see the attachment.
Author Response File: Author Response.pdf