Showing 1 - 10 of 1,111 for search: '"Zhao Bowen"'
Multimodal Question Answering (MMQA) is crucial as it enables comprehensive understanding and accurate responses by integrating insights from diverse data representations such as tables, charts, and text. Most existing research in MMQA only focuses o…
External link:
http://arxiv.org/abs/2410.21414
Homomorphic secret sharing (HSS) enables two servers to locally evaluate functions directly on encrypted data and obtain the results in the form of shares. A Paillier-based HSS solution seamlessly achieves multiplicative homomorphism and consumes less… (A sketch of the underlying Paillier homomorphism follows the link.)
External link:
http://arxiv.org/abs/2410.06514
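The Paillier scheme named above is natively additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. Below is a minimal, deliberately insecure Python sketch of that property (toy primes, no padding, helper names of my own choosing); it illustrates the primitive such HSS constructions build on, not the paper's two-server protocol.

import random
from math import gcd

# Toy Paillier keypair -- NOT secure, tiny primes chosen only for illustration.
p, q = 10007, 10009
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1)     # a multiple of lcm(p-1, q-1), sufficient for decryption
g = n + 1                   # standard choice of generator

def L(u):                   # Paillier's "L" function
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # decryption constant (needs Python 3.8+)

def encrypt(m):
    r = random.randrange(2, n)
    while gcd(r, n) != 1:             # r must be a unit mod n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 42, 58
c_sum = (encrypt(a) * encrypt(b)) % n2   # ciphertext product = plaintext sum
assert decrypt(c_sum) == a + b           # additive homomorphism: decrypts to 100

Lifting this to multiplicative homomorphism across two servers is precisely the share-conversion problem the abstract alludes to; the linked paper gives the actual protocol.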
Large vision-language models (VLMs) have become state-of-the-art for many computer vision tasks, with in-context learning (ICL) as a popular adaptation strategy for new ones. But can VLMs learn novel concepts purely from visual demonstrations, or are…
External link:
http://arxiv.org/abs/2409.17080
A quasi-local mass, typically defined as an integral over a spacelike $2$-surface $\Sigma$, should encode information about the gravitational field within a finite, extended region bounded by $\Sigma$. Therefore, in attempts to quantize gravity, one… (A classic example of such a definition is given after the link.)
External link:
http://arxiv.org/abs/2407.00593
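As background on what such a $2$-surface integral looks like (an illustrative example, not necessarily the definition this paper works with), the Brown-York quasi-local energy compares the mean curvature $k$ of $\Sigma$ inside a spanning spacelike 3-slice with the mean curvature $k_0$ of an isometric embedding of $\Sigma$ into flat $\mathbb{R}^3$:

$$E_{\mathrm{BY}}(\Sigma) = \frac{1}{8\pi}\int_{\Sigma}\left(k_0 - k\right)\,dA.$$

The reference term $k_0$ anchors the energy to flat space, so that round spheres in a flat slice of Minkowski space are assigned zero energy.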
We look at the strong-field behavior of the Wang-Yau quasi-local energy. In particular, we examine the limit of the Wang-Yau quasi-local energy as the defining spacelike $2$-surface $\Sigma$ approaches an apparent horizon from outside. Assuming that… (The general form of this energy is recalled after the link.)
External link:
http://arxiv.org/abs/2406.10751
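For context on the quantity whose horizon limit is studied, recall the general shape of the Wang-Yau quasi-local energy (one standard convention; signs and normalizations vary across the literature). Given an isometric embedding $X$ of $\Sigma$ into Minkowski space with time function $\tau$ and observer $T_0$,

$$8\pi\,E(\Sigma,X,T_0)=\int_{\hat{\Sigma}}\hat{k}\,d\hat{\Sigma}-\int_{\Sigma}\left[\sqrt{1+|\nabla\tau|^{2}}\,\cosh\theta\,|H|-\nabla\tau\cdot\nabla\theta-\alpha_{H}(\nabla\tau)\right]d\Sigma,\qquad \sinh\theta=\frac{-\Delta\tau}{|H|\sqrt{1+|\nabla\tau|^{2}}},$$

where $|H|$ is the norm of the mean curvature vector of $\Sigma$, $\alpha_H$ is the connection one-form of its normal bundle, and $\hat{\Sigma}$ is the projection of $X(\Sigma)$ onto the hyperplane orthogonal to $T_0$. At an apparent horizon the mean curvature vector becomes null, so the horizon limit probes this functional as $|H|$ degenerates.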
We resolve the nature of the quantum phase transition between a Néel antiferromagnet and a valence-bond solid in two-dimensional spin-1/2 magnets. We study a class of $J$-$Q$ models, in which Heisenberg exchange $J$ competes with interactions $Q_n$… (The conventional form of these models is sketched after the link.)
External link:
http://arxiv.org/abs/2405.06607
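The $J$-$Q$ family is conventionally written in terms of singlet projectors (a standard form following Sandvik's original construction; the lattice arrangement of the $Q_n$ bond products is specified in the paper):

$$H=-J\sum_{\langle ij\rangle}P_{ij}-Q_{n}\sum\,\prod_{m=1}^{n}P_{i_{m}j_{m}},\qquad P_{ij}=\tfrac{1}{4}-\mathbf{S}_{i}\cdot\mathbf{S}_{j},$$

where $P_{ij}$ projects a pair of spins onto its singlet state. The $J$ term alone favors Néel order, while products of $n$ singlet projectors favor a valence-bond solid, so tuning $Q_n/J$ drives the transition described above.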
We review the Wang-Yau quasi-local definitions along the lines of the gravitational Hamiltonian. This makes clear the connection and difference between the Wang-Yau definition and the Brown-York or even the global ADM definition. We make a brief comment on admissibility…
External link:
http://arxiv.org/abs/2402.19310
Language models (LMs) are trained on web text originating from many points in time and, in general, without any explicit temporal grounding. This work investigates the temporal chaos of pretrained LMs and explores various methods to align their inter… (A toy prompting baseline is sketched after the link.)
External link:
http://arxiv.org/abs/2402.16797
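One simple baseline in this line of work is time-aware prompting: condition the model on an explicit target date and let it answer relative to that date. The Python sketch below is hypothetical (the model choice and prompt format are mine, not the paper's) and assumes the Hugging Face transformers library.

# Hypothetical time-aware prompting sketch: prefix a target year so the
# pretrained LM conditions its answer on that point in time.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # any causal LM works

def time_aware_answer(question: str, year: int) -> str:
    prompt = f"Year: {year}. {question}"
    out = generator(prompt, max_new_tokens=30, do_sample=False)
    return out[0]["generated_text"]

print(time_aware_answer("Who is the current US president?", 2019))

Prompting like this is only the simplest of the "various methods" the abstract mentions exploring.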
Fine-tuning and inference with Large Language Models (LMs) are generally known to be expensive. Parameter-efficient fine-tuning over pretrained LMs reduces training memory by updating a small number of LM parameters but does not improve inference effi… (A minimal adapter sketch follows the link.)
External link:
http://arxiv.org/abs/2401.12200
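For background on the trade-off described above: a LoRA-style adapter, the canonical parameter-efficient method, trains only two small low-rank matrices beside a frozen weight. A minimal illustrative PyTorch sketch (not the technique proposed in the paper):

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    # Frozen base weight W plus a trainable low-rank update (alpha/r) * B @ A.
    def __init__(self, d_in: int, d_out: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)
        self.base.weight.requires_grad_(False)        # pretrained weight stays fixed
        self.base.bias.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)
        self.B = nn.Parameter(torch.zeros(d_out, r))  # zero init: starts as a no-op
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(768, 768)
y = layer(torch.randn(4, 768))                        # output shape [4, 768]
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)                                      # 12288 trainable vs. 590592 frozen

Training touches only A and B, which is why memory drops; but at inference the extra low-rank matmul remains unless it is merged into the frozen weight, which is the inference-efficiency gap the snippet points at.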
Author:
Zhao, Bowen; Ji, Changkai; Zhang, Yuejie; He, Wen; Wang, Yingwen; Wang, Qing; Feng, Rui; Zhang, Xiaobo
With the Generative Pre-trained Transformer 3.5 (GPT-3.5) exhibiting remarkable reasoning and comprehension abilities in Natural Language Processing (NLP), most Question Answering (QA) research has primarily centered around general QA tasks based on…
External link:
http://arxiv.org/abs/2312.11521