Cheap, Fast, and Good Enough for the Non-biomedical Domain but is It Usable for Clinical Natural Language Processing? Evaluating Crowdsourcing for Clinical Trial Announcement Named Entity Annotations
Author: | Todd Lingren, Qi Li, Megan Kaiser, Laura Stoutenborough, Imre Solti, Louise Deléger, Haijun Zhai |
---|---|
Year of publication: | 2012 |
Subject: | Information retrieval, Computer science, Usability, Crowdsourcing, Domain (software engineering), Outsourcing, Clinical trial, Named entity, Crowdsourcing software development, Artificial intelligence, Natural language processing |
Source: | HISB |
DOI: | 10.1109/hisb.2012.31 |
Description: | Building upon previous work in general crowdsourcing research, this study investigates the usability of crowdsourcing in the clinical NLP domain for annotating medical named entities and entity linkages in a clinical trial announcement (CTA) corpus. The results indicate that crowdsourcing is a feasible, inexpensive, fast, and practical approach to annotating clinical text (without PHI) at large scale for medical named entities. The crowdsourcing program code was released publicly. |
Database: | OpenAIRE |
External link: |