Text2Mesh: Text-Driven Neural Stylization for Meshes
Authors: | Michel, Oscar; Bar-On, Roi; Liu, Richard; Benaim, Sagie; Hanocka, Rana |
---|---|
Publication year: | 2022 |
Subject: |
FOS: Computer and information sciences; Computer Vision and Pattern Recognition (cs.CV); Computation and Language (cs.CL); Graphics (cs.GR); ComputingMethodologies_COMPUTERGRAPHICS |
Source: | 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). |
DOI: | 10.1109/cvpr52688.2022.01313 |
Description: | In this work, we develop intuitive controls for editing the style of 3D objects. Our framework, Text2Mesh, stylizes a 3D mesh by predicting color and local geometric details which conform to a target text prompt. We consider a disentangled representation of a 3D object: a fixed input mesh (content) coupled with a learned neural network, which we term a neural style field network. To modify style, we obtain a similarity score between a text prompt (describing style) and a stylized mesh by harnessing the representational power of CLIP. Text2Mesh requires neither a pre-trained generative model nor a specialized 3D mesh dataset. It can handle low-quality meshes (non-manifold, with boundaries, etc.) of arbitrary genus, and does not require a UV parameterization. We demonstrate the ability of our technique to synthesize a myriad of styles over a wide variety of 3D meshes. Project page: https://threedle.github.io/text2mesh/ |
Database: | OpenAIRE |
External link: |
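The abstract describes steering style via a similarity score between a text prompt and a rendered stylized mesh, computed in CLIP's joint embedding space. A minimal sketch of that scoring idea, using plain NumPy with made-up toy vectors standing in for the real CLIP text and image encoders (the actual method embeds the prompt and augmented renders of the mesh, and maximizes their cosine similarity):

```python
import numpy as np

def clip_style_score(text_emb: np.ndarray, image_emb: np.ndarray) -> float:
    """Cosine similarity between a text embedding and an image embedding.

    In Text2Mesh-style training this score is maximized, i.e. the loss
    is its negative, so gradients push the rendered mesh toward the prompt.
    """
    text_emb = text_emb / np.linalg.norm(text_emb)
    image_emb = image_emb / np.linalg.norm(image_emb)
    return float(np.dot(text_emb, image_emb))

# Hypothetical 4-D embeddings (real CLIP embeddings are 512-D vectors
# produced by model.encode_text / model.encode_image).
prompt_emb = np.array([0.5, 0.1, 0.8, 0.2])
render_emb = np.array([0.4, 0.2, 0.7, 0.1])

score = clip_style_score(prompt_emb, render_emb)
loss = -score  # minimize negative similarity
```

This only illustrates the scoring; the full pipeline differentiably renders the stylized mesh from multiple views before embedding it.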