Author
Listed:
- Fangling Leng
(School of Computer Science and Engineering, Northeastern University, Shenyang 110169, China)
- Fan Li
(School of Computer Science and Engineering, Northeastern University, Shenyang 110169, China)
- Yubin Bao
(School of Computer Science and Engineering, Northeastern University, Shenyang 110169, China)
- Tiancheng Zhang
(School of Computer Science and Engineering, Northeastern University, Shenyang 110169, China)
- Ge Yu
(School of Computer Science and Engineering, Northeastern University, Shenyang 110169, China)
Abstract
Existing models for feature extraction of complex, similar entities have shortcomings in how they exploit relative position information and in their ability to extract key features. Compared with English, Chinese named entity recognition is distinguished by the absence of space delimiters, the pronounced polysemy and homonymy of characters, the diversity and commonness of names, and a greater reliance on complex contextual and linguistic structures. To address this, an entity recognition method based on DeBERTa-Attention-BiLSTM-CRF (DABC) is proposed. First, the feature extraction capability of the DeBERTa model is used to extract features from the data; then, an attention mechanism is introduced to further enhance the extracted features; finally, a BiLSTM captures long-distance dependencies in the text and a CRF layer produces the predicted label sequence, from which the entities in the text are identified. The proposed model is validated on the dataset. Experiments show that the precision (P) of the proposed DABC model reaches 88.167%, the recall (R) reaches 83.121%, and the F1 value reaches 85.024%. Compared with other models, the F1 value improves by 3–5%, verifying the superiority of the model. In the future, the method can be extended to recognize complex entities in more fields.
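The pipeline described in the abstract (DeBERTa encoder, attention layer, BiLSTM, CRF decoder) can be sketched in PyTorch. The sketch below is illustrative only: it assumes the HuggingFace transformers and pytorch-crf packages, and the checkpoint name, hidden sizes, and number of attention heads are assumptions for demonstration, not values taken from the paper.

```python
# Minimal sketch of a DeBERTa-Attention-BiLSTM-CRF (DABC-style) tagger.
# Assumes: pip install torch transformers pytorch-crf
import torch
import torch.nn as nn
from transformers import AutoModel   # HuggingFace Transformers encoder
from torchcrf import CRF             # linear-chain CRF layer


class DABCTagger(nn.Module):
    def __init__(self, num_tags: int,
                 encoder_name: str = "microsoft/deberta-base",  # assumed checkpoint
                 lstm_hidden: int = 256, attn_heads: int = 8):  # assumed sizes
        super().__init__()
        # 1) DeBERTa extracts contextual token features.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # 2) Self-attention re-weights the encoder output to highlight key features.
        self.attn = nn.MultiheadAttention(hidden, attn_heads, batch_first=True)
        # 3) BiLSTM captures long-distance dependencies in both directions.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # 4) Linear layer projects to per-tag emission scores for the CRF.
        self.emit = nn.Linear(2 * lstm_hidden, num_tags)
        # 5) CRF decodes a globally consistent label sequence.
        self.crf = CRF(num_tags, batch_first=True)

    def _emissions(self, input_ids, attention_mask):
        x = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        x, _ = self.attn(x, x, x, key_padding_mask=~attention_mask.bool())
        x, _ = self.bilstm(x)
        return self.emit(x)

    def loss(self, input_ids, attention_mask, tags):
        emissions = self._emissions(input_ids, attention_mask)
        # Negative log-likelihood of the gold tag sequence under the CRF.
        return -self.crf(emissions, tags, mask=attention_mask.bool())

    def predict(self, input_ids, attention_mask):
        emissions = self._emissions(input_ids, attention_mask)
        # Viterbi decoding returns the best tag sequence per sentence.
        return self.crf.decode(emissions, mask=attention_mask.bool())
```

Any BIO-style tag set (e.g., B-PER, I-PER, O) can be used for num_tags; the actual label inventory, training setup, and dataset in the paper are not reproduced here.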
Suggested Citation
Fangling Leng & Fan Li & Yubin Bao & Tiancheng Zhang & Ge Yu, 2024.
"DABC: A Named Entity Recognition Method Incorporating Attention Mechanisms,"
Mathematics, MDPI, vol. 12(13), pages 1-15, June.
Handle:
RePEc:gam:jmathe:v:12:y:2024:i:13:p:1992-:d:1423979
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:12:y:2024:i:13:p:1992-:d:1423979. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help by adding them using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.