Article
Peer-Review Record

ENEX-FP: A BERT-Based Address Recognition Model

by Min Li, Zeyu Liu, Gang Li, Mingle Zhou * and Delong Han
Reviewer 1:
Reviewer 2:
Reviewer 3: Anonymous
Electronics 2023, 12(1), 209; https://doi.org/10.3390/electronics12010209
Submission received: 14 December 2022 / Revised: 27 December 2022 / Accepted: 28 December 2022 / Published: 1 January 2023
(This article belongs to the Special Issue Natural Language Processing and Information Retrieval)

Round 1

Reviewer 1 Report

The authors propose an address recognition model based on BERT.

Some recent works are missing from the state of the art. Please provide a more extensive state-of-the-art review and a comparison indicating advantages, disadvantages, and why your solution is different, since the approach is close to previous approaches.

The model in Figure 1 must be described in more detail; I am unable to replicate the work with this information alone. How many LSTM cells are in the chain, what are the encoder architectures, how many encoders are there, etc.?

The model in Figure 4 must also be detailed: number of layers, number of nodes, dropout rate, etc.

Regarding the dataset, how many address elements are considered? What changes if a different number of elements is considered?

Do all models considered in Figure 5 run on the same dataset under the same conditions?

The results in Table 3 are not that different. How many runs did you execute? Do the compared algorithms also use learning rate optimization and adversarial training? The issue is to clearly identify what improves the solution and whether it can be applied to other cases or models.

Author Response

Dear Professor,

Thank you for your comments on our article entitled "ENEX-FP: A BERT-Based Address Recognition Model" (ID: Electronic-2128558). Your comments are of great help in revising and improving our paper and provide important guidance for our research. We have carefully studied them and made the corresponding corrections, which we hope will meet with your approval. We have replied to your questions point by point and uploaded our replies as an attachment; please see the attachment for details.

Yours Sincerely,
Zeyu Liu
E-mail: 10431220439@stu.qlu.edu.cn

Author Response File: Author Response.pdf

Reviewer 2 Report

This paper proposes an ENEX-FP model for named entity recognition of addresses. It has three main components:

1. the ENEX model, which extracts the BERT output both as the word-vector input to the BiLSTM model and as the contextual-information feature vector processed by BERT;

2. the feature processor (FP) model, which performs feature rounding on the ENEX output vector;

3. adversarial training applied while the model is training.

Adding adversarial training looks like a good idea. As shown in the ablation study, it improves the model performance.
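
For context on point 3, adversarial training for BERT-based sequence labeling is commonly implemented as an FGM-style perturbation of the embedding layer during training. The following is a minimal sketch of that general technique, assuming a PyTorch model whose embedding parameters contain "word_embeddings" in their names; it is an illustration, not the authors' exact implementation, and the names FGM, model, batch, and optimizer are hypothetical.

import torch

class FGM:
    """Fast Gradient Method: perturb the word-embedding weights along the
    gradient direction, run a second backward pass, then restore the weights."""
    def __init__(self, model, epsilon=1.0, emb_name="word_embeddings"):
        self.model = model
        self.epsilon = epsilon
        self.emb_name = emb_name
        self.backup = {}

    def attack(self):
        for name, param in self.model.named_parameters():
            if param.requires_grad and self.emb_name in name and param.grad is not None:
                self.backup[name] = param.data.clone()
                norm = torch.norm(param.grad)
                if norm != 0 and not torch.isnan(norm):
                    param.data.add_(self.epsilon * param.grad / norm)

    def restore(self):
        for name, param in self.model.named_parameters():
            if name in self.backup:
                param.data = self.backup[name]
        self.backup = {}

# Typical training step (model, batch, optimizer are placeholders):
# loss = model(**batch).loss
# loss.backward()                 # gradients for the clean example
# fgm = FGM(model)
# fgm.attack()                    # add the perturbation to the embedding table
# model(**batch).loss.backward()  # accumulate adversarial gradients
# fgm.restore()                   # remove the perturbation
# optimizer.step(); optimizer.zero_grad()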

Some issues can be addressed:

Most importantly, the data used for comparison are not sufficient. Only one dataset is used for the method comparison; adding more datasets is expected.

Other writing suggestions:

1. Line 117 is not cited properly: "By using adversarial training to improve the robustness, the model’s accuracy can be further optimized. Szegedy et al".

Similarly for lines 120-121.

2. Figures 2 and 3 are poorly plotted and not necessary.

3. Figure 4 is too big.


Author Response

Dear Professor,

Thank you for your comments on our article entitled "ENEX-FP: A BERT-Based Address Recognition Model" (ID: Electronic-2128558). Your comments are of great help in revising and improving our paper and provide important guidance for our research. We have carefully studied them and made the corresponding corrections, which we hope will meet with your approval. We have replied to your questions point by point and uploaded our replies as an attachment; please see the attachment for details.

Yours Sincerely,
Zeyu Liu
E-mail: 10431220439@stu.qlu.edu.cn

Author Response File: Author Response.pdf

Reviewer 3 Report

This article focuses on the ENEX-FP address recognition model, consisting of an Entity Extractor (ENEX) and a Feature Processor (FP), for named entity recognition of addresses. The authors consider the problems of parsing addresses characterized by free-form writing, many default aliases, and strong locality. The authors' contributions are, firstly, developing the ENEX entity extractor, which fully uses the BERT output for word-vector embedding and feature-vector extraction; secondly, creating the feature processor (FP) model; and thirdly, applying a learning rate decay strategy and a hierarchical learning-rate setting to effectively improve the model's accuracy.
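
As a point of reference for the third contribution, a hierarchical (layer-wise) learning-rate setting with decay is typically realized by splitting the parameters into groups, giving the pretrained encoder a small learning rate and the task head a larger one, and decaying both per epoch. The sketch below is a generic PyTorch illustration, assuming the encoder's parameter names start with "bert"; the rates and the decay factor are placeholders, not values taken from the paper.

from torch.optim import AdamW
from torch.optim.lr_scheduler import ExponentialLR

def grouped_parameters(model, base_lr=3e-5, head_lr=1e-3, weight_decay=0.01):
    """Small learning rate for the pretrained BERT encoder, larger learning rate
    for the task-specific head (e.g., BiLSTM/CRF/classifier layers)."""
    bert_params, head_params = [], []
    for name, param in model.named_parameters():
        (bert_params if name.startswith("bert") else head_params).append(param)
    return [
        {"params": bert_params, "lr": base_lr, "weight_decay": weight_decay},
        {"params": head_params, "lr": head_lr, "weight_decay": weight_decay},
    ]

# optimizer = AdamW(grouped_parameters(model))
# scheduler = ExponentialLR(optimizer, gamma=0.95)  # multiply both rates by 0.95 each epoch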

A few remarks that should be corrected:

1. The claim in lines 38-41 is not supported by any references and is therefore doubtful.

2. The Related Work section is uninformative and should be improved. In particular, provide a more detailed explanation of the approaches described, their accuracy, and the data used.

3. The article contains unexpanded abbreviations and, in other places, full names given after abbreviations that were already introduced. Abbreviation usage should be unified throughout the text.

4. Figure 4 seems unfinished. Labels should be added for all incoming and outgoing arrows.

5. It is not clear from Section 4.2 how the number of names, maximum length, batch size, and dropout were determined. Please provide this information.

6. Why don't the authors use a cross-validation procedure as a more objective method of evaluating the developed approach? As a rule, accuracy metrics are overestimated when a single fixed test dataset is used (see the sketch after this list).

7. Have the authors attempted to compare with other researchers' approaches to identifying address elements? To make the authors' contributions clearer, this comparison should be made and the results discussed.
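
Regarding remark 6, a k-fold cross-validation loop of the kind suggested could look like the sketch below; sentences, labels, and train_and_eval are hypothetical placeholders for the address dataset and the training/evaluation routine, and the reported score is the mean and standard deviation of the per-fold F1.

from sklearn.model_selection import KFold
import numpy as np

def cross_validate(sentences, labels, train_and_eval, k=5, seed=42):
    """k-fold cross-validation; train_and_eval is a caller-supplied function
    that trains on the train split and returns an F1 score on the test split."""
    kf = KFold(n_splits=k, shuffle=True, random_state=seed)
    scores = []
    for train_idx, test_idx in kf.split(sentences):
        train = ([sentences[i] for i in train_idx], [labels[i] for i in train_idx])
        test = ([sentences[i] for i in test_idx], [labels[i] for i in test_idx])
        scores.append(train_and_eval(train, test))
    return float(np.mean(scores)), float(np.std(scores))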

Author Response

Dear Professor,

Thank you for your comments on our article entitled "ENEX-FP: A BERT-Based Address Recognition Model" (ID: Electronic-2128558). Your comments are of great help in revising and improving our paper and provide important guidance for our research. We have carefully studied them and made the corresponding corrections, which we hope will meet with your approval. We have replied to your questions point by point and uploaded our replies as an attachment; please see the attachment for details.

Yours Sincerely,
Zeyu Liu
E-mail: 10431220439@stu.qlu.edu.cn

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors have addressed all issues raised by the reviewers and improved the article accordingly.

Reviewer 2 Report

Considering the replies to the other reviewers and the revision, I think it looks good now.
