Showing 1 - 10 of 167 for search: '"Niepert, Mathias"'
In this work, we propose a simple transformer-based baseline for multimodal molecular representation learning, integrating three distinct modalities: SMILES strings, 2D graph representations, and 3D conformers of molecules. A key aspect of our approach …
External link:
http://arxiv.org/abs/2410.07981
Authors:
Nguyen, Duy M. H., Diep, Nghiem T., Nguyen, Trung Q., Le, Hoang-Bao, Nguyen, Tai, Nguyen, Tien, Nguyen, TrungTin, Ho, Nhat, Xie, Pengtao, Wattenhofer, Roger, Zhou, James, Sonntag, Daniel, Niepert, Mathias
State-of-the-art medical multi-modal large language models (med-MLLM), like LLaVA-Med or BioMedGPT, leverage instruction-following data in pre-training. However, those models primarily focus on scaling the model size and data volume to boost performance …
External link:
http://arxiv.org/abs/2410.02615
Discrete diffusion models have recently shown significant progress in modeling complex data, such as natural languages and DNA sequences. However, unlike diffusion models for continuous data, which can generate high-quality samples in just a few denoising …
External link:
http://arxiv.org/abs/2410.01949
In an era where large language models (LLMs) are increasingly integrated into a wide range of everyday applications, research into these models' behavior has surged. However, due to the novelty of the field, clear methodological guidelines are lacking …
External link:
http://arxiv.org/abs/2409.20303
Authors:
Musekamp, Daniel, Kalimuthu, Marimuthu, Holzmüller, David, Takamoto, Makoto, Niepert, Mathias
Solving partial differential equations (PDEs) is a fundamental problem in engineering and science. While neural PDE solvers can be more efficient than established numerical solvers, they often require large amounts of training data that is costly to …
External link:
http://arxiv.org/abs/2408.01536
Machine learning plays an increasingly important role in computational chemistry and materials science, complementing computationally intensive ab initio and first-principles methods. Despite their utility, machine-learning models often lack generali…
External link:
http://arxiv.org/abs/2408.05215
Authors:
Nguyen, Duy M. H., Le, An T., Nguyen, Trung Q., Diep, Nghiem T., Nguyen, Tai, Duong-Tran, Duy, Peters, Jan, Shen, Li, Niepert, Mathias, Sonntag, Daniel
Prompt learning methods are gaining increasing attention due to their ability to customize large vision-language models to new domains using pre-trained contextual knowledge and minimal training data. However, existing works typically rely on optimiz…
External link:
http://arxiv.org/abs/2407.04489
Transformer models are increasingly used for solving Partial Differential Equations (PDEs). Several adaptations have been proposed, all of which suffer from the typical problems of Transformers, such as quadratic memory and time complexity. Furthermore, …
External link:
http://arxiv.org/abs/2406.03919
Message-passing graph neural networks (MPNNs) have emerged as a powerful paradigm for graph-based machine learning. Despite their effectiveness, MPNNs face challenges such as under-reaching and over-squashing, where limited receptive fields and struc…
External link:
http://arxiv.org/abs/2405.17311
Authors:
Tran, Hoai-Chau, Nguyen, Duy M. H., Nguyen, Duy M., Nguyen, Trung-Tin, Le, Ngan, Xie, Pengtao, Sonntag, Daniel, Zou, James Y., Nguyen, Binh T., Niepert, Mathias
Increasing the throughput of the Transformer architecture, a foundational component used in numerous state-of-the-art models for vision and language tasks (e.g., GPT, LLaVa), is an important problem in machine learning. One recent and effective strategy …
External link:
http://arxiv.org/abs/2405.16148