Using Scene Graph Context to Improve Image Generation

Authors: Tripathi, Subarna; Bhiwandiwalla, Anahita; Bastidas, Alexei; Tang, Hanlin
Year of publication: 2019
Subject:
Document type: Working Paper
Abstract: Generating realistic images from scene graphs requires neural networks to reason about object relationships and compositionality. As a relatively new task, how to properly ensure that the generated images comply with scene graphs, and how to measure task performance, remain open questions. In this paper, we propose to harness scene graph context to improve image generation from scene graphs. We introduce a scene graph context network that pools features generated by a graph convolutional neural network; the pooled features are then provided to both the image generation network and the adversarial loss. With the context network, our model is trained not only to generate realistic-looking images, but also to better preserve non-spatial object relationships. We also define two novel evaluation metrics for this task, the relation score and the mean opinion relation score, which directly evaluate scene graph compliance. We use both quantitative and qualitative studies to demonstrate that our proposed model outperforms the state of the art on this challenging task.
Comment: arXiv admin note: text overlap with arXiv:1804.01622 by other authors
Database: arXiv
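The context-pooling idea summarized in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature shapes, the choice of mean pooling, and the function names are all assumptions. The point is only that per-object features from a graph network are pooled into a single scene-level context vector that conditions both the generator and the adversarial loss.

```python
import numpy as np

def pool_scene_graph_context(node_feats: np.ndarray) -> np.ndarray:
    """Pool per-object features (e.g. GCN outputs) into one context vector.

    node_feats: (num_objects, feat_dim) array.
    Returns a (feat_dim,) scene context vector (mean pooling is an assumption).
    """
    return node_feats.mean(axis=0)

def condition_on_context(node_feats: np.ndarray) -> np.ndarray:
    """Concatenate the pooled scene context onto every object's features,
    so downstream networks (generator, discriminator) see the whole graph."""
    ctx = pool_scene_graph_context(node_feats)
    ctx_tiled = np.broadcast_to(ctx, node_feats.shape)
    return np.concatenate([node_feats, ctx_tiled], axis=1)

# Hypothetical example: 4 objects with 3-dimensional features.
feats = np.arange(12, dtype=float).reshape(4, 3)
out = condition_on_context(feats)
print(out.shape)  # (4, 6): original features plus the shared context vector
```

In a full model the pooled context would be a learned, trainable aggregation rather than a fixed mean, but the data flow, one shared context vector broadcast to all objects, is the same.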