Metadata-Guided Visual Representation Learning for Biomedical Images

Author: Xian Zhang, Imtiaz Hossain, Stephan Spiegel, Christopher Ball
Language: English
Year of publication: 2019
DOI: 10.1101/725754
Description:
Motivation: The clustering of biomedical images according to their phenotype is an important step in early drug discovery. Modern high-content screening devices easily produce thousands of cell images, but the resulting data are usually unlabelled, and extra effort is required to construct a visual representation that supports grouping by the presented morphological characteristics.

Results: We introduce a novel approach to visual representation learning that is guided by metadata. In high-content screening, metadata can typically be derived from the experimental layout, which links each cell image of a particular assay to the tested chemical compound and the corresponding compound concentration. In general, there is a one-to-many relationship between phenotype and compound, since different molecules and different dosages can lead to one and the same alteration in biological cells. Our empirical results show that metadata-guided visual representation learning is an effective approach for clustering biomedical images. We have evaluated the proposed approach on both benchmark and real-world biological data. Furthermore, we have juxtaposed implicit and explicit learning techniques, which differ in both loss function and batch construction. Our experiments demonstrate that metadata-guided visual representation learning can identify commonalities and distinguish differences in visual appearance, yielding meaningful clusters even without image-level annotations.

Note: Please refer to the supplementary material for implementation details on the metadata-guided visual representation learning strategies; an illustrative sketch of one such strategy is given after this record.
Database: OpenAIRE
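
The following is a minimal sketch, not the authors' implementation (their details are in the supplementary material), of one plausible "explicit" metadata-guided strategy: batches are built from the compound metadata of the experimental layout, with anchor and positive images drawn from the same compound and the negative from a different one, and the embedding is trained with a standard triplet margin loss. All names (EmbedNet, make_triplet) and the synthetic tensors are hypothetical placeholders.

import random
import torch
import torch.nn as nn

class EmbedNet(nn.Module):
    # Toy CNN mapping a 3-channel cell image to a unit-norm embedding.
    def __init__(self, dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, dim)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return nn.functional.normalize(self.head(h), dim=1)

def make_triplet(images_by_compound):
    # Metadata-guided batch construction: anchor and positive share a
    # compound label from the experimental layout; the negative does not.
    c_pos, c_neg = random.sample(list(images_by_compound), 2)
    anchor, positive = random.sample(images_by_compound[c_pos], 2)
    negative = random.choice(images_by_compound[c_neg])
    return anchor, positive, negative

# Synthetic stand-in data: two compounds, four 3x32x32 "images" each.
data = {c: [torch.randn(3, 32, 32) for _ in range(4)] for c in ("cmpdA", "cmpdB")}

net = EmbedNet()
criterion = nn.TripletMarginLoss(margin=0.2)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(10):
    # Add a batch dimension of 1 to each image before embedding.
    a, p, n = (t.unsqueeze(0) for t in make_triplet(data))
    loss = criterion(net(a), net(p), net(n))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

The contrastive pairing above stands in for the explicit side of the comparison; an implicit variant could instead, for example, predict the compound label directly with a cross-entropy classification head, changing both the loss and the batch construction. Both readings are assumptions about the paper's taxonomy, not a statement of its actual method.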