Showing 1 - 10 of 78 results for the search: '"Yip, Nung Kwan"'
Author:
Golovaty, Dmitry, Yip, Nung Kwan
Given a small spherical particle, we consider the flow of a nematic liquid crystal in the corresponding exterior domain. Our focus is on the precise far-field asymptotic behavior of the flow in a parameter regime where the governing equations can be reduced to…
External link:
http://arxiv.org/abs/2409.00939
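For orientation on what a far-field expansion looks like in the simplest setting (an illustrative baseline, not the paper's anisotropic result): for isotropic Stokes flow past a particle exerting a net force F on the fluid, the velocity behaves at infinity like a stokeslet,

\[
u(x) \sim U_\infty + \frac{1}{8\pi\mu}\left(\frac{I}{|x|} + \frac{x\,x^{\top}}{|x|^{3}}\right) F + O(|x|^{-2}), \qquad |x| \to \infty,
\]

where \mu is the viscosity and U_\infty the background flow; the abstract's reduced equations concern the nematic analogue of such an expansion.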
Author:
Gao, Yuan, Yip, Nung Kwan
We prove the convergence of a Wasserstein gradient flow of a free energy in inhomogeneous media. Both the energy and the media can depend on the spatial variable in a fast oscillatory manner. In particular, we show that the gradient flow structure is…
External link:
http://arxiv.org/abs/2312.01584
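For reference, the objects in play can be written in a standard form (the particular free energy below is an assumption chosen for illustration): a Wasserstein gradient flow of a free energy E is the continuity equation

\[
\partial_t \rho = \nabla \cdot \left( \rho \, \nabla \frac{\delta E}{\delta \rho} \right),
\qquad
E_\varepsilon[\rho] = \int \Phi\!\left(x, \tfrac{x}{\varepsilon}\right) \rho(x)\, dx + \int \rho \log \rho \, dx,
\]

where the fast variable x/\varepsilon models the oscillatory dependence on the media; the convergence question is then which limiting energy and gradient-flow structure survive as \varepsilon \to 0.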
In this work, we systematically investigate linear multi-step methods for differential equations with memory. In particular, we focus on the numerical stability of multi-step methods. Based on this investigation, we give some sufficient conditions…
External link:
http://arxiv.org/abs/2305.06571
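To make the class of schemes concrete, here is a minimal sketch (the test equation, kernel, and quadrature are my own choices, not the paper's): a two-step Adams-Bashforth method for a linear integro-differential equation, with the memory integral discretized by the composite trapezoidal rule.

import numpy as np

def solve_memory_ode(lam, K, y0, T, n):
    # AB2 for y'(t) = lam*y(t) + int_0^t K(t - s) y(s) ds (illustrative).
    h = T / n
    t = np.linspace(0.0, T, n + 1)
    y = np.zeros(n + 1)
    y[0] = y0

    def rhs(m):
        if m == 0:
            memory = 0.0                      # int_0^0 (...) ds = 0
        else:
            w = np.full(m + 1, h)             # trapezoidal weights on [0, t_m]
            w[0] = w[-1] = h / 2
            memory = np.dot(w, K(t[m] - t[:m + 1]) * y[:m + 1])
        return lam * y[m] + memory

    f_prev = rhs(0)
    y[1] = y[0] + h * f_prev                  # one Euler step to start the method
    for m in range(1, n):
        f_curr = rhs(m)
        y[m + 1] = y[m] + h * (1.5 * f_curr - 0.5 * f_prev)   # AB2 update
        f_prev = f_curr
    return t, y

# Test problem y' = -y + int_0^t exp(-(t-s)) y(s) ds, y(0) = 1,
# whose exact solution tends to 1/2 as t -> infinity.
t, y = solve_memory_ode(lam=-1.0, K=lambda r: np.exp(-r), y0=1.0, T=20.0, n=2000)
print(y[-1])   # approximately 0.5

The point of a stability analysis here is that the memory term feeds every past value into each step, so the classical root-condition arguments for multi-step methods have to be adapted.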
We analyze a nonlinear PDE system describing the motion of a microswimmer in a nematic liquid crystal environment. For the microswimmer's motility, the squirmer model is used, in which self-propulsion enters the model through the slip velocity on the…
External link:
http://arxiv.org/abs/2206.06415
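For context, in the classical squirmer model the slip is prescribed on the sphere's surface; the standard two-mode truncation reads (whether the paper truncates at two modes is an assumption)

\[
u_\theta \big|_{r=a} = B_1 \sin\theta + B_2 \sin\theta\cos\theta,
\]

where a is the swimmer radius, B_1 sets the swimming speed (U = 2B_1/3 for a squirmer in a Stokes fluid), and the ratio \beta = B_2/B_1 distinguishes pushers (\beta < 0) from pullers (\beta > 0).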
We study the continuum epitaxial model for elastic interacting atomic steps on vicinal surfaces proposed by Xiang and E (Xiang, SIAM J. Appl. Math. 63:241-258, 2002; Xiang and E, Phys. Rev. B 69:035409, 2004). The non-local term and the singularity c…
External link:
http://arxiv.org/abs/2204.10051
Author:
Du, Hengrong, Yip, Nung Kwan
We show that self-similar solutions for the mean curvature flow, surface diffusion, and Willmore flow of entire graphs are stable upon perturbations of initial data with small Lipschitz norm. Roughly speaking, the perturbed solutions are asymptotically…
External link:
http://arxiv.org/abs/2108.13538
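To fix ideas for the mean curvature flow case (standard formulas; the profile notation \varphi is mine): an entire graph u evolves by

\[
\partial_t u = \sqrt{1+|\nabla u|^2}\;\, \nabla \cdot \left( \frac{\nabla u}{\sqrt{1+|\nabla u|^2}} \right),
\]

and a self-similarly expanding solution takes the form u(x,t) = \sqrt{t}\,\varphi(x/\sqrt{t}); for the fourth-order surface diffusion and Willmore flows the scaling t^{1/4} plays the role of t^{1/2}. Stability in the sense of the abstract means that initial data within a small Lipschitz distance of such a profile produce solutions converging back to it.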
Stochastic Gradient (SG) is the de facto iterative technique to solve stochastic optimization (SO) problems with a smooth (non-convex) objective $f$ and a stochastic first-order oracle. SG's attractiveness is due in part to its simplicity of executing…
External link:
http://arxiv.org/abs/2103.04392
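A minimal sketch of the SG iteration itself (the quadratic objective, noise model, and step-size schedule below are illustrative assumptions, not the paper's setting):

import numpy as np

rng = np.random.default_rng(0)

def oracle(x, noise=0.1):
    # Unbiased stochastic gradient of f(x) = 0.5*||x||^2 (true gradient is x).
    return x + noise * rng.standard_normal(x.shape)

x = rng.standard_normal(10)
for k in range(1, 10_001):
    alpha = 1.0 / k             # one classical diminishing step-size choice
    x = x - alpha * oracle(x)   # SG update: x_{k+1} = x_k - alpha_k g(x_k)

print(np.linalg.norm(x))        # near 0, the minimizer of this toy objective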
Published in:
CSIAM Trans. Appl. Math., 3 (2022), pp. 692-760
Gradient descent yields zero training loss in polynomial time for deep neural networks despite the non-convex nature of the objective function. The behavior of the network in the infinite-width limit trained by gradient descent can be described by the Neural Tangent Kernel (NTK)…
External link:
http://arxiv.org/abs/2007.03714
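Assuming the truncated sentence refers to the Neural Tangent Kernel (Jacot et al.), the standard facts behind it are these (notation f, \theta, X, Y is mine): the kernel

\[
\Theta(x, x') = \big\langle \nabla_\theta f(x;\theta),\; \nabla_\theta f(x';\theta) \big\rangle
\]

stays constant during training in the infinite-width limit, so gradient descent on the squared loss over data (X, Y) drives the outputs by the linear dynamics

\[
\partial_t f_t(X) = -\,\Theta \,\big( f_t(X) - Y \big),
\]

and positive-definiteness of \Theta then yields convergence of the training loss to zero.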
It is well known that elastic effects can cause surface instability. In this paper, we analyze a one-dimensional discrete system which can reveal a pattern-formation mechanism resembling the "step-bunching" phenomena for epitaxial growth on vicinal surfaces…
External link:
http://arxiv.org/abs/2004.12279
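Purely as a toy illustration of how competing interactions in a one-dimensional discrete system can drive clustering (this is not the paper's model; the potential, parameters, and dynamics below are all my assumptions):

import numpy as np

# N "steps" on a line under gradient descent of a pair energy
# E = 0.5 * sum_{i != j} V(|x_i - x_j|) with V(r) = 1/r^2 - c/r,
# i.e. a repulsive core with an attractive tail; density fluctuations
# can then coarsen into clusters, loosely resembling step bunching.
N, n_iter, dt, c = 20, 50_000, 1e-4, 8.0
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, N) + 0.05 * rng.standard_normal(N)

def force(x):
    d = x[:, None] - x[None, :]               # signed pairwise separations
    np.fill_diagonal(d, np.inf)               # no self-interaction
    dV = -2.0 / np.abs(d) ** 3 + c / d ** 2   # V'(|d|)
    return -(np.sign(d) * dV).sum(axis=1)     # F_i = -dE/dx_i

for _ in range(n_iter):
    x = x + dt * force(x)                     # explicit gradient step

print(np.round(np.sort(x), 2))                # nearby steps collect into bunches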
Academic article
This result cannot be displayed to unauthenticated users; signing in is required to view it.