A Transformer-based Neural Language Model that Synthesizes Brain Activation Maps from Free-Form Text Queries

Authors: Ngo, Gia H., Nguyen, Minh, Chen, Nancy F., Sabuncu, Mert R.
Publication Year: 2022
Subject:
Source: Medical Image Analysis. 2022 Jul 19:102540
Document Type: Working Paper
DOI: 10.1016/j.media.2022.102540
Description: Neuroimaging studies are often limited by the number of subjects and cognitive processes that can be feasibly interrogated. However, a rapidly growing number of neuroscientific studies have collectively accumulated an extensive wealth of results. Digesting this growing literature and obtaining novel insights remains a major challenge, since existing meta-analytic tools are constrained to keyword queries. In this paper, we present Text2Brain, an easy-to-use tool for synthesizing brain activation maps from open-ended text queries. Text2Brain was built on a transformer-based neural network language model and a coordinate-based meta-analysis of neuroimaging studies. It combines a transformer-based text encoder with a 3D image generator, and was trained on variable-length text snippets and their corresponding activation maps sampled from 13,000 published studies. In our experiments, we demonstrate that Text2Brain can synthesize meaningful neural activation patterns from various free-form textual descriptions. Text2Brain is available at https://braininterpreter.com as a web-based tool for efficiently searching through the vast neuroimaging literature and generating new hypotheses. (A minimal architectural sketch follows this record.)
Comment: arXiv admin note: text overlap with arXiv:2109.13814
Database: arXiv
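
The description above outlines a two-part model: a transformer text encoder whose pooled representation conditions a 3D image generator that outputs a brain activation volume. The following PyTorch sketch illustrates that general pipeline only; the class name, layer sizes, tokenization, and output volume shape are assumptions for illustration and are not the authors' implementation.

    # Minimal sketch (not the authors' code): a transformer text encoder whose
    # pooled output conditions a 3D transposed-convolution generator that emits
    # a brain activation volume. All dimensions are illustrative.
    import torch
    import torch.nn as nn


    class TextToBrainSketch(nn.Module):
        def __init__(self, vocab_size=30522, d_model=256, n_heads=8,
                     n_layers=4, volume_channels=1):
            super().__init__()
            # Token embeddings + a small transformer encoder stand in for the
            # pretrained language model described in the abstract.
            self.embed = nn.Embedding(vocab_size, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer,
                                                 num_layers=n_layers)

            # Project the pooled text representation to a coarse 3D latent
            # grid, then upsample with transposed 3D convolutions.
            self.to_latent = nn.Linear(d_model, 64 * 6 * 7 * 6)
            self.generator = nn.Sequential(
                nn.ConvTranspose3d(64, 32, kernel_size=4, stride=2, padding=1),
                nn.ReLU(),
                nn.ConvTranspose3d(32, 16, kernel_size=4, stride=2, padding=1),
                nn.ReLU(),
                nn.ConvTranspose3d(16, volume_channels, kernel_size=4,
                                   stride=2, padding=1),
            )

        def forward(self, token_ids, attention_mask=None):
            x = self.embed(token_ids)                   # (B, T, d_model)
            key_padding_mask = None
            if attention_mask is not None:
                key_padding_mask = attention_mask == 0  # True marks padding
            h = self.encoder(x, src_key_padding_mask=key_padding_mask)
            pooled = h.mean(dim=1)                      # mean-pool over tokens
            latent = self.to_latent(pooled).view(-1, 64, 6, 7, 6)
            return self.generator(latent)               # (B, 1, 48, 56, 48)


    if __name__ == "__main__":
        model = TextToBrainSketch()
        tokens = torch.randint(0, 30522, (2, 12))       # two dummy text queries
        volume = model(tokens)
        print(volume.shape)                             # torch.Size([2, 1, 48, 56, 48])

Since the paper trains on activation maps derived from a coordinate-based meta-analysis, a voxelwise regression loss (e.g., mean squared error against the target volume) would be one natural way to train such a sketch; this is an assumption, not a description of the authors' training objective.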