Entity Recognition Approach of Clinical Documents Based on Self-training Framework
Author: | Nannan Che, Jiajin Le, Dehua Chen |
Year: | 2018 |
Subject: | Parsing, Dependency, Computer science, Supervised learning, Annotation, Pattern recognition, Feature (machine learning), Artificial intelligence, Natural language processing |
Source: | Advances in Intelligent Systems and Computing, ISBN 9789811089435 |
DOI: | 10.1007/978-981-10-8944-2_31 |
Description: | Entity recognition in clinical documents is a primary task for extracting information from unstructured clinical text. Traditional entity recognition methods extract entities in a supervised learning framework, which requires a large labeled corpus as training data. However, real-world clinical documents are unlabeled, and constructing a large labeled corpus manually is time-consuming. Semi-supervised learning, which relies only on a small-scale corpus, can address this problem. This paper therefore proposes an entity recognition model for clinical documents based on a self-training framework. In this framework, we first build a partially annotated corpus through dependency syntax analysis combined with medical statement rules. A hybrid CNN-LSTM-CRF model is then proposed to label the unlabeled data in an end-to-end way: a CNN embeds the characters of a clinical document, a Bi-LSTM extracts sentence-level features, and a CRF remedies the shortcomings of the LSTM by modeling label-transition probabilities and optimizing over the whole output sequence. Finally, entity recognition results with high confidence are fed back by self-training to expand the corpus, which improves the accuracy of document entity recognition. Experimental results demonstrate the effectiveness and efficiency of the model. |
Database: | OpenAIRE |
External link: |
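The self-training feedback loop summarized in the abstract (label unlabeled data, keep only high-confidence predictions, feed them back to expand the corpus) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the CNN-LSTM-CRF model is replaced by a hypothetical `predict_with_confidence` stub, and the confidence threshold and round count are assumptions.

```python
def predict_with_confidence(model, sentence):
    """Stand-in for CNN-LSTM-CRF inference: returns (labels, confidence).

    Hypothetical stub: tags every token 'O' and reports higher confidence
    for longer sentences, just to exercise the loop below.
    """
    labels = ["O"] * len(sentence)
    confidence = 0.9 if len(sentence) > 2 else 0.5
    return labels, confidence


def self_train(model, labeled, unlabeled, threshold=0.8, rounds=3):
    """Expand the labeled corpus with high-confidence auto-labels.

    Each round, sentences whose predicted labels exceed `threshold`
    are moved into `labeled`; the rest are retried in the next round.
    """
    for _ in range(rounds):
        still_unlabeled = []
        for sentence in unlabeled:
            labels, conf = predict_with_confidence(model, sentence)
            if conf >= threshold:
                labeled.append((sentence, labels))   # feed back into corpus
            else:
                still_unlabeled.append(sentence)     # low confidence: retry later
        unlabeled = still_unlabeled
        # retrain(model, labeled)  # in the real framework, the model is refit here
    return labeled, unlabeled
```

In the actual framework the model would be retrained on the expanded corpus each round; the stub keeps the example self-contained.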