Showing 1 - 10 of 121 results for search: '"MOYA, CHRISTIAN"'
We propose a novel fine-tuning method to achieve multi-operator learning through training a distributed neural operator with diverse function data and then zero-shot fine-tuning the neural network using physics-informed losses for downstream tasks. …
External link:
http://arxiv.org/abs/2411.07239
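As an illustration of the zero-shot fine-tuning idea, the following is a minimal PyTorch sketch of adapting a network with a purely physics-informed loss (no labeled data); the toy Poisson problem u''(x) = f(x), the small MLP, and all names here are illustrative assumptions, not the paper's distributed neural operator.

import torch
import torch.nn as nn

# Hypothetical small network standing in for a pretrained operator's solution head.
model = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                      nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))

def forcing(x):
    # Forcing term of the toy PDE u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0.
    return -(torch.pi ** 2) * torch.sin(torch.pi * x)

def physics_loss(model, n_points=128):
    # Collocation points where the PDE residual is enforced; no labeled outputs are used.
    x = torch.rand(n_points, 1, requires_grad=True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = d2u - forcing(x)
    boundary = model(torch.tensor([[0.0], [1.0]]))
    return residual.pow(2).mean() + boundary.pow(2).mean()

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(500):
    optimizer.zero_grad()
    loss = physics_loss(model)
    loss.backward()
    optimizer.step()

The residual-plus-boundary pattern is what "physics-informed" refers to: the downstream PDE itself supplies the training signal, so no labeled solution data is needed for fine-tuning.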
Optimizing the learning rate remains a critical challenge in machine learning, essential for achieving model stability and efficient convergence. The Vector Auxiliary Variable (VAV) algorithm introduces a novel energy-based self-adjustable learning rate …
External link:
http://arxiv.org/abs/2411.06573
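As a rough illustration of what an energy-based, self-adjusting step size means, the sketch below uses an AEGD-style auxiliary variable that only decreases and scales the gradient step; this is a generic stand-in chosen for illustration, not the paper's VAV update.

import numpy as np

def loss(x):
    # Toy objective: a quadratic bowl with minimum at the origin.
    return 0.5 * np.sum(x ** 2)

def grad(x):
    return x

c, eta = 1.0, 0.1
x = np.array([3.0, -2.0])
# Auxiliary "energy" variable, initialized from the loss itself; c keeps it positive.
r = np.sqrt(loss(x) + c)

for _ in range(200):
    v = grad(x) / (2.0 * np.sqrt(loss(x) + c))
    # r can only shrink, and it shrinks faster where the gradient is large,
    # so the effective step size 2 * eta * r adjusts itself during training.
    r = r / (1.0 + 2.0 * eta * v ** 2)
    x = x - 2.0 * eta * r * v

print(x, loss(x))

Because r is non-increasing by construction, the effective step size adapts downward as optimization proceeds, which is the stability property such energy-based schemes aim for.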
Author:
Mollaali, Amirhossein, Zufferey, Gabriel, Constante-Flores, Gonzalo, Moya, Christian, Li, Can, Lin, Guang, Yue, Meng
This paper proposes a new data-driven methodology for predicting intervals of post-fault voltage trajectories in power systems. We begin by introducing the Quantile Attention-Fourier Deep Operator Network (QAF-DeepONet), designed to capture the complex …
External link:
http://arxiv.org/abs/2410.24162
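Prediction intervals from quantile-based networks of this kind typically come from training with the pinball (quantile) loss; a minimal sketch on toy data follows, with a generic two-output network standing in for the paper's QAF-DeepONet architecture.

import torch

def pinball_loss(pred, target, q):
    # Pinball (quantile) loss: penalizes under- and over-prediction asymmetrically,
    # so the minimizer is the q-th conditional quantile of the target.
    diff = target - pred
    return torch.mean(torch.maximum(q * diff, (q - 1) * diff))

# Toy data: 1D input, noisy target.
x = torch.linspace(0, 1, 256).unsqueeze(1)
y = torch.sin(2 * torch.pi * x) + 0.1 * torch.randn_like(x)

# A small network with two outputs: the 0.05 and 0.95 quantiles.
model = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(1000):
    opt.zero_grad()
    out = model(x)
    lo, hi = out[:, :1], out[:, 1:]
    loss = pinball_loss(lo, y, 0.05) + pinball_loss(hi, y, 0.95)
    loss.backward()
    opt.step()

# After training, [lo, hi] forms an approximate 90% prediction interval for y given x.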
In this paper, we adopt conformal prediction, a distribution-free uncertainty quantification (UQ) framework, to obtain confidence prediction intervals with coverage guarantees for Deep Operator Network (DeepONet) regression. Initially, we enhance the …
External link:
http://arxiv.org/abs/2402.15406
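The split conformal prediction recipe behind such coverage guarantees is short; the NumPy sketch below uses a placeholder point predictor and toy calibration data rather than the paper's DeepONet setting.

import numpy as np

rng = np.random.default_rng(0)

def predict(x):
    # Placeholder point predictor standing in for a trained regression model.
    return np.sin(x)

# Calibration set drawn from the same distribution as future test points.
x_cal = rng.uniform(0, 2 * np.pi, 500)
y_cal = np.sin(x_cal) + 0.1 * rng.normal(size=500)

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for miscoverage level alpha (finite-sample corrected).
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Interval for a new input: point prediction plus or minus q; it covers the true
# response with probability at least 1 - alpha, marginally over calibration and test.
x_new = 1.3
interval = (predict(x_new) - q, predict(x_new) + q)
print(interval)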
Approximate Thompson sampling with Langevin Monte Carlo broadens its reach from Gaussian posterior sampling to encompass more general smooth posteriors. However, it still encounters scalability issues in high-dimensional problems when demanding high …
External link:
http://arxiv.org/abs/2401.11665
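A minimal sketch of this kind of approximate Thompson sampling, with the posterior sample produced by a short run of unadjusted Langevin steps on a Gaussian linear bandit posterior, is given below; the dimensions, priors, and step sizes are illustrative choices, not the paper's.

import numpy as np

rng = np.random.default_rng(1)
d, n_arms, n_rounds = 5, 10, 200
theta_true = rng.normal(size=d)
arms = rng.normal(size=(n_arms, d))

X, y = [], []           # features and rewards observed so far
theta = np.zeros(d)     # current Langevin chain state (approximate posterior sample)

def grad_neg_log_post(theta, X, y, sigma2=1.0, prior_var=1.0):
    # Gradient of the negative log posterior: Gaussian prior plus Gaussian likelihood.
    g = theta / prior_var
    if X:
        Xa, ya = np.array(X), np.array(y)
        g = g + Xa.T @ (Xa @ theta - ya) / sigma2
    return g

for t in range(n_rounds):
    # Shrink the Langevin step as data accumulates to keep the unadjusted chain stable.
    step = 1e-2 / (1 + len(X))
    for _ in range(50):
        theta = (theta - step * grad_neg_log_post(theta, X, y)
                 + np.sqrt(2 * step) * rng.normal(size=d))
    # Thompson sampling: act greedily with respect to the sampled parameter vector.
    a = int(np.argmax(arms @ theta))
    reward = float(arms[a] @ theta_true + rng.normal())
    X.append(arms[a])
    y.append(reward)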
Author:
Mollaali, Amirhossein, Sahin, Izzet, Raza, Iqrar, Moya, Christian, Paniagua, Guillermo, Lin, Guang
In the pursuit of accurate experimental and computational data while minimizing effort, there is a constant need for high-fidelity results. However, achieving such results often requires significant computational resources. To address this challenge, …
External link:
http://arxiv.org/abs/2311.03639
Neural operators have been applied in various scientific fields, such as solving parametric partial differential equations, dynamical systems with control, and inverse problems. However, challenges arise when dealing with input functions that exhibit …
External link:
http://arxiv.org/abs/2310.18888
We present a new framework for computing fine-scale solutions of multiscale Partial Differential Equations (PDEs) using operator learning tools. Obtaining fine-scale solutions of multiscale PDEs can be challenging, but there are many inexpensive …
External link:
http://arxiv.org/abs/2308.14188
This paper designs surrogate models with uncertainty quantification capabilities to improve the thermal performance of rib-turbulated internal cooling channels effectively. To construct the surrogate, we use the deep operator network (DeepONet) framework …
External link:
http://arxiv.org/abs/2306.00810
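For reference, the DeepONet framework mentioned here pairs a branch network (encoding the input function sampled at fixed sensor locations) with a trunk network (encoding the query coordinate), combined by an inner product; below is a minimal PyTorch sketch with placeholder sizes and data, not the paper's cooling-channel surrogate.

import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, n_sensors=100, width=64, p=32):
        super().__init__()
        # Branch net: encodes the input function sampled at n_sensors locations.
        self.branch = nn.Sequential(nn.Linear(n_sensors, width), nn.Tanh(), nn.Linear(width, p))
        # Trunk net: encodes the coordinate at which the output function is queried.
        self.trunk = nn.Sequential(nn.Linear(1, width), nn.Tanh(), nn.Linear(width, p))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        # Output G(u)(y) is approximated by the inner product of branch and trunk features.
        return (self.branch(u_sensors) * self.trunk(y)).sum(-1, keepdim=True) + self.bias

# Toy usage: a batch of 8 input functions sampled at 100 sensors, each queried at one point.
model = DeepONet()
u = torch.randn(8, 100)
y = torch.rand(8, 1)
out = model(u, y)   # shape (8, 1)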
This paper presents NSGA-PINN, a multi-objective optimization framework for effective training of Physics-Informed Neural Networks (PINNs). The proposed framework uses the Non-dominated Sorting Genetic Algorithm (NSGA-II) to enable traditional stochastic …
External link:
http://arxiv.org/abs/2303.02219
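The core idea of treating the PINN's data loss and physics-residual loss as separate objectives ranked by Pareto dominance, rather than summing them into one scalar, can be sketched briefly; the tiny network, toy ODE, and random-perturbation population below stand in for the paper's full NSGA-II machinery.

import torch
import torch.nn as nn

torch.manual_seed(0)

def make_net():
    return nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

net = make_net()
x_data = torch.tensor([[0.0]]); y_data = torch.tensor([[1.0]])    # observed condition u(0) = 1
x_col = torch.rand(64, 1, requires_grad=True)                      # collocation points

def objectives(model):
    # Objective 1: data mismatch at the observed points.
    data_loss = (model(x_data) - y_data).pow(2).mean()
    # Objective 2: residual of the toy ODE u'(x) = -u(x) at the collocation points.
    u = model(x_col)
    du = torch.autograd.grad(u, x_col, torch.ones_like(u))[0]
    phys_loss = (du + u).pow(2).mean()
    return data_loss.item(), phys_loss.item()

def perturbed_copy(model, scale=0.05):
    # Candidate solution: the current parameters plus Gaussian noise.
    clone = make_net()
    clone.load_state_dict(model.state_dict())
    with torch.no_grad():
        for p in clone.parameters():
            p.add_(scale * torch.randn_like(p))
    return clone

population = [perturbed_copy(net) for _ in range(20)]
scores = [objectives(m) for m in population]

def dominated(i):
    # Candidate i is dominated if another candidate is at least as good in both
    # objectives and not identical, hence strictly better in at least one.
    return any(s != scores[i] and s[0] <= scores[i][0] and s[1] <= scores[i][1] for s in scores)

pareto_front = [population[i] for i in range(len(population)) if not dominated(i)]
print(len(pareto_front), "non-dominated candidates out of", len(population))

NSGA-II additionally applies crossover, mutation, and crowding-distance selection over many generations; the snippet only shows the dominance test that drives that selection.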