Author:
Lena Maier-Hein, Sven Mersmann, Daniel Kondermann, Christian Stock, Hannes Gotz Kenngott, Alexandro Sanchez, Martin Wagner, Anas Preukschas, Anna-Laura Wekerle, Stefanie Helfert, Sebastian Bodenstedt, Stefanie Speidel
Publication year:
2014
Source:
Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention. 17(Pt 2)
Description:
Computer-assisted minimally-invasive surgery (MIS) is often based on algorithms that require establishing correspondences between endoscopic images. However, the reference annotations required to train or validate such a method are extremely difficult to obtain because they are typically made by a medical expert with very limited time, and publicly available data sets are still far too small to capture the wide range of anatomical and scene variance. Crowdsourcing is a new trend based on outsourcing cognitive tasks to many anonymous, untrained individuals from an online community. To our knowledge, this paper is the first to investigate the concept of crowdsourcing in the context of endoscopic video image annotation for computer-assisted MIS. According to our study on publicly available in vivo data with manual reference annotations, anonymous non-experts obtain a median annotation error of 2 px (n = 10,000). By applying cluster analysis to multiple annotations per correspondence, this error can be reduced to about 1 px, which is comparable to that obtained by medical experts (n = 500). We conclude that crowdsourcing is a viable method for generating high-quality reference correspondences in endoscopic video images.
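The error reduction described above comes from fusing several crowd annotations of the same point correspondence into one consensus location. The paper uses cluster analysis for this; the sketch below uses a simpler coordinate-wise median as an illustrative robust aggregator, and all function names (`consensus_point`, `annotation_error`) are our own, not from the paper.

```python
# Hedged sketch: fusing multiple crowd annotations of one point
# correspondence into a consensus location. The paper applies cluster
# analysis; a coordinate-wise median is shown here as a simple robust
# stand-in that likewise suppresses outlier clicks.

def consensus_point(annotations):
    """Return the coordinate-wise median of a list of (x, y) clicks."""
    def median(vals):
        s = sorted(vals)
        n = len(s)
        mid = n // 2
        return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2.0
    xs = [p[0] for p in annotations]
    ys = [p[1] for p in annotations]
    return (median(xs), median(ys))

def annotation_error(pred, ref):
    """Euclidean distance in pixels between a point and the reference."""
    return ((pred[0] - ref[0]) ** 2 + (pred[1] - ref[1]) ** 2) ** 0.5

# Hypothetical example: five crowd clicks around a true point (100, 50),
# one of them a gross outlier.
clicks = [(99, 50), (101, 51), (100, 49), (100, 50), (130, 80)]
reference = (100, 50)
fused = consensus_point(clicks)
print(fused)                                  # (100, 50)
print(annotation_error(fused, reference))     # 0.0
```

The outlier click shifts a mean-based fusion noticeably but leaves the median untouched, which is the same robustness property that clustering exploits in the study.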
Database:
OpenAIRE