Article
Peer-Review Record

A Self-Adaptive Approximated-Gradient-Simulation Method for Black-Box Adversarial Sample Generation

Appl. Sci. 2023, 13(3), 1298; https://doi.org/10.3390/app13031298
by Yue Zhang 1, Seong-Yoon Shin 2,*, Xujie Tan 1,* and Bin Xiong 3
Reviewer 1:
Reviewer 2: Anonymous
Submission received: 11 December 2022 / Revised: 16 January 2023 / Accepted: 16 January 2023 / Published: 18 January 2023
(This article belongs to the Special Issue Future Information & Communication Engineering 2022)

Round 1

Reviewer 1 Report

The paper presents a novel self-adaptive approximated-gradient-simulation method for generating perturbation samples, by which black-box adversarial attacks can be performed. The paper describes the proposed methodology well and correctly evaluates the results in comparison with other well-known attack methods.

It is worth mentioning that this is a popular research field and that many more attack methods have been published very recently. It is therefore hard to judge the true value of the proposed research, as not all published approaches could justifiably be compared with the method proposed by the authors.

From the title, it is not clear whether the method is about performing or preventing black-box adversarial attacks. Further, the first two sentences of the abstract give the impression that DNN adversarial attacks occur only in image classification tasks.
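
For context on the attack family discussed in this report, the sketch below illustrates in a generic way how a black-box attack can approximate gradients purely from model queries, using symmetric finite differences. It is a minimal illustration under simplifying assumptions, not the authors' self-adaptive approximated-gradient-simulation algorithm; query_model, epsilon, and step_size are hypothetical placeholders, and the "model" is a toy linear scorer standing in for a remote classifier.

```python
# Generic sketch of approximated-gradient estimation for a black-box attack.
# NOT the authors' algorithm; all names here are hypothetical placeholders.
import numpy as np

def query_model(x):
    # Stand-in for a black-box query that returns a scalar loss/score.
    w = np.linspace(-1.0, 1.0, x.size)
    return float(np.dot(w, x.ravel()))

def estimate_gradient(x, epsilon=1e-2):
    # Symmetric finite differences: two model queries per input coordinate.
    grad = np.zeros(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = epsilon
        grad[i] = (query_model(x + e) - query_model(x - e)) / (2.0 * epsilon)
    return grad.reshape(x.shape)

def attack_step(x, step_size=0.05):
    # One FGSM-style step taken along the estimated (not true) gradient.
    return np.clip(x + step_size * np.sign(estimate_gradient(x)), 0.0, 1.0)

if __name__ == "__main__":
    x0 = np.random.rand(8)          # toy 8-dimensional "image"
    x1 = attack_step(x0)
    print("score before:", query_model(x0), "after:", query_model(x1))
```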

Author Response

We thank the editor and reviewers for their detailed and constructive comments, which helped us improve the paper. In accordance with the comments, we have made the following revisions. Efforts were also made to correct mistakes and improve the English of the paper.

Author Response File: Author Response.pdf

Reviewer 2 Report

The researchers used a deep learning model to perform black-box attacks on different types of images. The authors have done well.

Major comments:
1. The experimental results section is not explained well.
2. The performance of the deep learning model is poor and is not explained.
3. Performance metrics such as accuracy, precision, and sensitivity are not reported (a minimal illustration of these metrics follows this report).
4. The results of the authors' model should be compared with recent existing models.

Minor comments:
1. The article needs proofreading.
2. Most of the references are old; references from the last three years should be cited.
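
The metrics requested above are the standard binary confusion-matrix quantities; the minimal sketch below shows how they are conventionally computed. The function name and the example labels are hypothetical and are not taken from the manuscript.

```python
# Minimal sketch of accuracy, precision, and sensitivity (recall) for a
# binary task. The example data is made up for illustration only.
import numpy as np

def binary_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)      # true positives
    tn = np.sum(~y_true & ~y_pred)    # true negatives
    fp = np.sum(~y_true & y_pred)     # false positives
    fn = np.sum(y_true & ~y_pred)     # false negatives
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp) if (tp + fp) > 0 else 0.0,
        "sensitivity": tp / (tp + fn) if (tp + fn) > 0 else 0.0,  # recall
    }

if __name__ == "__main__":
    # e.g. 1 = positive class (attack succeeded), 0 = negative class
    print(binary_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0]))
```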

Author Response

We thank the editor and reviewers for their detailed and constructive comments, which helped us improve the paper. In accordance with the comments, we have made the following revisions. Efforts were also made to correct mistakes and improve the English of the paper.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

Thank you to the authors.

The authors did not address all of my comments.

1. The results of the authors' model should be compared with recent existing models.

I cannot find a response to this point; I am asking for a comparison with existing work.

Author Response

Thank you for your suggestion. In accordance with your suggestions, we have added a model comparison experiment.

Author Response File: Author Response.pdf

Round 3

Reviewer 2 Report

Thank you to the authors, but I cannot recommend acceptance yet, as my comments have still not been addressed.

1. I asked the authors to use common evaluation metrics such as sensitivity, etc.

2. I can trust your results only if I see a comparative study with existing research.

Author Response

Thank you for your suggestion. In accordance with it, we have added Section 4.2.3. Please see the attachment.

Author Response File: Author Response.pdf
