Author: |
Stender, Merten; Ohlsen, Jakob; Geisler, Hendrik; Chabchoub, Amin; Hoffmann, Norbert; Schlaefer, Alexander |
Subject: |
|
Source: |
Computational Mechanics; Jun2023, Vol. 71 Issue 6, p1227-1249, 23p |
Abstract: |
In the age of big data availability, data-driven techniques have been proposed recently to compute the time evolution of spatio-temporal dynamics. Depending on the required a priori knowledge about the underlying processes, a spectrum of black-box end-to-end learning approaches, physics-informed neural networks, and data-informed discrepancy modeling approaches can be identified. In this work, we propose a purely data-driven approach that uses fully convolutional neural networks to learn spatio-temporal dynamics directly from parameterized datasets of linear spatio-temporal processes. The parameterization allows for data fusion of field quantities, domain shapes, and boundary conditions in the proposed Uₚ-Net architecture. Multi-domain Uₚ-Net models, therefore, can generalize to different scenes, initial conditions, domain shapes, and domain sizes without requiring re-training or physical priors. Numerical experiments conducted on a universal and two-dimensional wave equation and the transient heat equation for validation purposes show that the proposed Uₚ-Net outperforms classical U-Net and conventional encoder–decoder architectures of the same complexity. Owing to the scene parameterization, the Uₚ-Net models learn to predict refraction and reflections arising from domain inhomogeneities and boundaries. Generalization properties of the model outside the physical training parameter distributions and for unseen domain shapes are analyzed. The deep learning flow map models are employed for long-term predictions in a recursive time-stepping scheme, indicating the potential for data-driven forecasting tasks. This work is accompanied by an open-sourced code. [ABSTRACT FROM AUTHOR] |
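The abstract's "recursive time-stepping scheme" means the learned one-step flow map is applied to its own output to produce a long-term rollout. A minimal sketch of that idea, with an explicit finite-difference heat-equation update standing in for the trained Uₚ-Net (the `flow_map` and `rollout` names are hypothetical, not from the paper's code):

```python
import numpy as np

def flow_map(u):
    # Stand-in for a trained flow-map network: one explicit
    # finite-difference heat-equation step on a periodic grid.
    # A real Up-Net would replace this with a CNN forward pass.
    alpha = 0.1  # assumed diffusion number (stable for alpha <= 0.25)
    lap = (np.roll(u, 1, axis=0) + np.roll(u, -1, axis=0)
           + np.roll(u, 1, axis=1) + np.roll(u, -1, axis=1) - 4.0 * u)
    return u + alpha * lap

def rollout(u0, n_steps):
    """Recursive time-stepping: feed each prediction back as input."""
    u, states = u0, [u0]
    for _ in range(n_steps):
        u = flow_map(u)       # prediction becomes the next input
        states.append(u)
    return np.stack(states)   # shape: (n_steps + 1, ny, nx)

u0 = np.zeros((16, 16))
u0[8, 8] = 1.0                # point heat source as initial condition
traj = rollout(u0, 50)
```

Because the rollout reuses its own predictions, one-step errors of the learned model accumulate over time; this is why the paper analyzes long-term prediction quality separately from one-step accuracy.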
Database: |
Complementary Index |
External link: |
|