Named entity recognition is a fundamental task in natural language processing. This paper proposes an end-to-end, efficient named entity recognition method based on a deep recurrent neural network. BERT is used to train character vectors as the raw input, enabling the model to capture more comprehensive textual information. An attention mechanism is added to the BiLSTM network so that it focuses on the key information in the text and ignores redundant information, improving the model's recognition efficiency. Finally, a CRF layer captures the dependencies between tags and decodes the predicted tag sequence for the entire sentence. Experiments show that the method performs well on the MSRA corpus.
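The final decoding step, in which the CRF layer scores tag transitions and predicts the best tag sequence for the whole sentence, is typically implemented as a Viterbi search. The following is a minimal NumPy sketch of that search, not the paper's implementation; the emission and transition matrices are assumed inputs (in the described model, emissions would come from the attention-weighted BiLSTM outputs, and transitions would be learned CRF parameters).

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence for one sentence.

    emissions:   (T, K) array, per-token score for each of K tags
                 (assumed to come from the BiLSTM/attention layers)
    transitions: (K, K) array, score of moving from tag i to tag j
                 (assumed learned CRF transition parameters)
    """
    T, K = emissions.shape
    score = emissions[0].copy()            # best score ending in each tag at t=0
    backptr = np.zeros((T, K), dtype=int)  # best previous tag for each (t, tag)
    for t in range(1, T):
        # total[i, j] = score of best path ending in tag i at t-1,
        # then transitioning to tag j and emitting token t as tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # Follow back-pointers from the best final tag to recover the sequence
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]
```

For example, with emissions favoring tags 0, 1, 0 on a three-token sentence and zero transition scores, the decoder returns the per-token argmax sequence `[0, 1, 0]`; non-zero transition scores let the CRF override locally attractive but globally inconsistent tags.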