Article
Peer-Review Record

A Novel Fully Convolutional Auto-Encoder Based on Dual Clustering and Latent Feature Adversarial Consistency for Hyperspectral Anomaly Detection

Remote Sens. 2024, 16(4), 717; https://doi.org/10.3390/rs16040717
by Rui Zhao, Zhiwei Yang, Xiangchao Meng * and Feng Shao
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 22 January 2024 / Revised: 12 February 2024 / Accepted: 13 February 2024 / Published: 18 February 2024

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

In this paper, the authors propose a novel fully convolutional auto-encoder based on dual clustering and latent feature adversarial consistency for hyperspectral anomaly detection, aiming to improve spatial information utilization and detection performance. Some minor revisions are needed. The detailed comments are as follows.

1. Figure 1 shows the triplet loss and the reconstruction loss, but the adversarial consistency loss is not reflected in the overall block diagram, which may confuse readers.

2. Figures 3 and 4 lack annotations for each component and are not labeled consistently with the annotations in the previous figures. Please improve them.

3. A diagram of the corresponding network structure is not provided. It is suggested to add such a diagram to improve the readability of the article.

4. In Eq. (11), how are the three parameters set?

5. Figures 13 and 15 are not very clear.

6. There are still some formatting problems in the reference list, such as missing page numbers in reference [42].

Author Response

Dear Reviewer:

 

Many thanks for your comments.

We have carried out revisions according to your comments. Detailed revisions addressing each comment can be found in the attached Word file.

 

Best regards,

Rui Zhao

Corresponding author: Xiangchao Meng (mengxiangchao@nbu.edu.cn)

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

General comment:

In this article, the authors propose a novel hyperspectral anomaly detection framework, FCAE-DCAC, which effectively addresses the problems of insufficient utilization of spatial information, unreasonable background distribution assumptions, and the lack of effective guidance during training. Moreover, the proposed framework shows strong hyperspectral anomaly detection performance on seven experimental datasets, exhibiting clear superiority over nine state-of-the-art methods. A minor revision is recommended before publication. There are some small issues with the presentation; please address the following detailed comments.

Detail comments:

1. In Eq. (2), the character lacks an explanation of its meaning. Please supplement it.

2. In Figure 3, one of the two red boxes is annotated with the module name, but the other is not. In addition, the figure does not indicate that they are two identical modules. Please improve it.

3. The first paragraph of the ablation study has a slight shortcoming in its presentation. A sentence should be added at the end of the first paragraph demonstrating the effectiveness of the triplet loss.

4. The ablation study also has a small deficiency in its presentation. An 11–12% increase in overall performance is mentioned at the end of the second paragraph. Does this refer to the change from the third case to the fourth, or from the first case to the fourth? Please add a brief clarification.

5. In the parameter analysis, a summary sentence should be added at the end stating the overall parameter configuration used across all of the datasets, rather than explaining it for each dataset separately, which may leave readers unclear.

Author Response

Dear Reviewer:

 

Many thanks for your comments.

We have carried out revisions according to your comments. Detailed revisions addressing each comment can be found in the attached Word file.

 

Best regards,

Rui Zhao

Corresponding author: Xiangchao Meng (mengxiangchao@nbu.edu.cn)

Author Response File: Author Response.pdf

Reviewer 3 Report

Comments and Suggestions for Authors

See the attached file.

Comments for author File: Comments.pdf

Author Response

Dear Reviewer:

 

Many thanks for your comments.

We have carried out revisions according to your comments. Detailed revisions addressing each comment can be found in the attached Word file.

 

Best regards,

Rui Zhao

Corresponding author: Xiangchao Meng (mengxiangchao@nbu.edu.cn)

Author Response File: Author Response.pdf
