Article
Peer-Review Record

A Multitask Learning Approach for Named Entity Recognition by Exploiting Sentence-Level Semantics Globally

Electronics 2022, 11(19), 3048; https://doi.org/10.3390/electronics11193048
by Wenzhi Huang 1, Tao Qian 2,*, Chen Lyu 3, Junchi Zhang 1, Guonian Jin 2, Yongkui Li 2 and Yongrui Xu 2
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3:
Submission received: 31 July 2022 / Revised: 10 September 2022 / Accepted: 21 September 2022 / Published: 24 September 2022
(This article belongs to the Section Computer Science & Engineering)

Round 1

Reviewer 1 Report

The paper presents a framework to solve the task of named entity recognition in NLP. The authors' approach uses sentence-level token representation instead of a more popular feature representation of a token. 

The paper provides a comprehensive description of the research methodology which includes the stages of model development, training, experimental check and evaluation of a developed model. 

The authors provide the experimental and benchmarking data which confirm the efficiency of the presented model. 

The argument presented is logically sound and supported by a sufficient amount of experimental data presented. 

The text of the manuscript is well-structured, clear and sufficiently detailed. 

Author Response

We sincerely thank the reviewer for the kind comments and valuable suggestions.

Reviewer 2 Report

In this paper, the authors present a joint model that exploits a richer sentence-level token representation for named entity recognition. However, there are some limitations that must be addressed, as follows.

1. The abstract is not attractive. Some sentences in the abstract should be revised to make it more appealing to readers. The grammatical mistakes should also be corrected.

2. There are two introduction sections. I think the second should be a related work section.

3. In the Introduction section, it is difficult to understand the novelty of the presented research work. The main contributions should be presented as bullet points.

4. The most recent work on NLP (NER) should be discussed, such as the following: "Traffic accident detection and condition analysis based on social networking" and "Towards zero-shot knowledge distillation for natural language processing".

5. It would be better to include figures in the word representation section; see "A novel text mining approach for mental health prediction using Bi-LSTM and BERT model".

6. The experiments are well presented. However, Figure 4 is difficult to understand.

7. The whole manuscript should be thoroughly revised in order to improve its English.

8. Future work should be discussed in the Conclusion.

Author Response

We sincerely thank the reviewer for the comprehensive and valuable suggestions on our manuscript.

Please see the attachment for the response.

Author Response File: Author Response.pdf

Reviewer 3 Report

This paper proposes a model that exploits a richer sentence-level token representation for named entity recognition, which is a very interesting task in the NLP field. To this end, the authors design a new auxiliary task, sentence-level entity type sequence prediction, for better model training.

I only have several concerns about the paper.

First, it would be better to include some figures to improve the presentation of this paper.

Second, I was puzzled by Equation (12); could the authors provide an explanation?

Third, beyond BiLSTM, there are more recent works related to named entity recognition, such as "Switchable Novel Object Captioner, TPAMI 22". The authors should discuss such related work in the paper.

Last, there is something confusing in Table 5: there are two `BaseJoint+BERT` entries in the table, but with different numbers. I suppose the first one should be without BERT.

Minor suggestions:

It would be better to add more visual examples to the paper.

Author Response

We sincerely thank the reviewer for the comprehensive and valuable suggestions on our manuscript.

Please see the attachment for the response.

Author Response File: Author Response.pdf

Reviewer 4 Report

This paper presents a model that exploits a richer sentence-level token representation for named entity recognition. The topic, in general, is excellent. The work is supported by experiments on four datasets spanning different languages and domains, showing promising results. I would recommend that the authors add some motivation connecting the current research to the interests of electronics. For instance, the authors need to consider the computing elements required to run these experiments on embedded systems or even a high-performance computing cluster. How much energy would be consumed to run a model such as the one presented in the current submission?

Author Response

We sincerely thank the reviewer for the comprehensive and valuable suggestions on our manuscript.

Please see the attachment for the response.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

This paper can be accepted in its present form. 
