Paper
BiLSTM-IN-TRANS for Chinese NER
23 August 2022
Jun Yin, Cui Zhu, Wenjun Zhu
Proceedings Volume 12305, International Symposium on Artificial Intelligence Control and Application Technology (AICAT 2022); 1230511 (2022) https://doi.org/10.1117/12.2645478
Event: International Symposium on Artificial Intelligence Control and Application Technology (AICAT 2022), 2022, Hangzhou, China
Abstract
The Transformer has strong feature extraction ability and has achieved good performance on various NLP tasks such as sentence classification, machine translation, and reading comprehension, but it does not perform well on named entity recognition (NER). According to recent research, Long Short-Term Memory (LSTM) networks usually perform better than the Transformer on NER. LSTM is a variant of the Recurrent Neural Network (RNN); because of its natural chain structure, it learns the forward and backward dependencies between words well, which makes it well suited to processing text sequences. In this paper, a BiLSTM network is embedded into the Transformer Encoder, and a new network structure, BiLSTM-IN-TRANS, is proposed, combining the sequential feature extraction capability of the BiLSTM with the powerful global feature extraction capability of the Transformer Encoder. Experiments show that a model based on BiLSTM-IN-TRANS works better on the NER task than a model using LSTM or the Transformer alone.
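The abstract does not specify exactly where the BiLSTM is placed inside the encoder, so the following is only a minimal PyTorch sketch of one plausible reading: a BiLSTM sublayer inserted after the multi-head self-attention sublayer, in place of the usual position-wise feed-forward network. All class names, hyperparameters, and the integration point are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical BiLSTM-in-Transformer encoder layer (illustrative sketch only;
# the paper's exact design is not described in the abstract).
import torch
import torch.nn as nn

class BiLSTMInTransLayer(nn.Module):
    def __init__(self, d_model=256, n_heads=8, lstm_hidden=128, dropout=0.1):
        super().__init__()
        # Global context: multi-head self-attention over the whole sentence.
        self.self_attn = nn.MultiheadAttention(
            d_model, n_heads, dropout=dropout, batch_first=True)
        # Sequential context: bidirectional LSTM. With hidden size
        # d_model // 2 per direction, the concatenated output matches
        # d_model, so the residual connection lines up.
        self.bilstm = nn.LSTM(d_model, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # Self-attention sublayer with residual connection and layer norm.
        attn_out, _ = self.self_attn(x, x, x,
                                     key_padding_mask=key_padding_mask)
        x = self.norm1(x + self.dropout(attn_out))
        # BiLSTM sublayer: left-to-right and right-to-left dependencies.
        lstm_out, _ = self.bilstm(x)  # (batch, seq_len, 2 * lstm_hidden)
        x = self.norm2(x + self.dropout(lstm_out))
        return x

# Usage: per-token features for a batch of embedded sentences.
layer = BiLSTMInTransLayer()
tokens = torch.randn(4, 32, 256)   # (batch, seq_len, d_model)
out = layer(tokens)                # same shape: per-token representations
print(out.shape)                   # torch.Size([4, 32, 256])
```

For NER, a per-token classifier (commonly a linear layer, often followed by a CRF in Chinese NER systems) would sit on top of these representations; the abstract does not detail the paper's tagging layer.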
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jun Yin, Cui Zhu, and Wenjun Zhu "BiLSTM-IN-TRANS for Chinese NER", Proc. SPIE 12305, International Symposium on Artificial Intelligence Control and Application Technology (AICAT 2022), 1230511 (23 August 2022); https://doi.org/10.1117/12.2645478
KEYWORDS
Transformers, Computer programming, Data modeling, Feature extraction, Neural networks, Information technology, Parallel computing