Self-supervised Multi-scale Consistency for Weakly Supervised Segmentation Learning

Author: Valvano, Gabriele; Leo, Andrea; Tsaftaris, Sotirios A.
Publication Year: 2021
Subject:
Document Type: Working Paper
Description: Collecting large-scale medical datasets with fine-grained annotations is time-consuming and requires experts. For this reason, weakly supervised learning aims at optimising machine learning models using weaker forms of annotations, such as scribbles, which are easier and faster to collect. Unfortunately, training with weak labels is challenging and needs regularisation. Herein, we introduce a novel self-supervised multi-scale consistency loss, which, coupled with an attention mechanism, encourages the segmentor to learn multi-scale relationships between objects and improves performance. We show state-of-the-art performance on several medical and non-medical datasets. The code used for the experiments is available at https://vios-s.github.io/multiscale-pyag.
Comment: Accepted at Domain Adaptation and Representation Transfer (DART) 2021
Database: arXiv
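
The sketch below illustrates, in PyTorch, one plausible form of a multi-scale consistency term of the kind the abstract describes. It is not the authors' released code (see the repository linked above): it assumes the segmentor exposes softmax predictions at several decoder scales, and the function name `multiscale_consistency_loss`, the average-pooling downsampling, and the mean-squared-error penalty are illustrative assumptions rather than the paper's exact formulation.

```python
# Hedged sketch, not the authors' implementation: enforce agreement between
# the full-resolution segmentation and coarser-scale predictions, assuming
# the segmentor outputs softmax probabilities at each scale.
import torch
import torch.nn.functional as F


def multiscale_consistency_loss(full_res_pred, coarse_preds):
    """full_res_pred: (B, C, H, W) class probabilities at full resolution.
    coarse_preds:  list of (B, C, h_i, w_i) probabilities at coarser scales.
    """
    loss = 0.0
    for coarse in coarse_preds:
        # Downsample the full-resolution prediction to the coarse scale;
        # average pooling keeps the per-class probabilities normalised.
        target = F.adaptive_avg_pool2d(full_res_pred, coarse.shape[-2:])
        # Penalise disagreement; the detach() treats the full-resolution
        # prediction as the (self-supervised) target at this scale.
        loss = loss + F.mse_loss(coarse, target.detach())
    return loss / max(len(coarse_preds), 1)


# Example usage with dummy tensors (2 images, 4 classes):
if __name__ == "__main__":
    full = torch.softmax(torch.randn(2, 4, 64, 64), dim=1)
    coarse = [torch.softmax(torch.randn(2, 4, s, s), dim=1) for s in (32, 16)]
    print(multiscale_consistency_loss(full, coarse))
```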