Showing 1 - 10 of 1,026 for search: '"MA, Xinyu"'
Author:
Xu, Yongxin, Zhang, Ruizhe, Jiang, Xinke, Feng, Yujie, Xiao, Yuzhen, Ma, Xinyu, Zhu, Runchuan, Chu, Xu, Zhao, Junfeng, Wang, Yasha
Retrieval-Augmented Generation (RAG) offers an effective solution to the issues faced by Large Language Models (LLMs) in hallucination generation and knowledge obsolescence by incorporating externally retrieved knowledge. However, existing methods la…
External link:
http://arxiv.org/abs/2410.10360
Author:
Sun, Weiwei, Shi, Zhengliang, Wu, Jiulong, Yan, Lingyong, Ma, Xinyu, Liu, Yiding, Cao, Min, Yin, Dawei, Ren, Zhaochun
Recent information retrieval (IR) models are pre-trained and instruction-tuned on massive datasets and tasks, enabling them to perform well on a wide range of tasks and potentially generalize to unseen tasks with instructions. However, existing IR be…
External link:
http://arxiv.org/abs/2410.10127
Despite advancements in enhancing LLM safety against jailbreak attacks, evaluating LLM defenses remains a challenge, with current methods often lacking explainability and generalization to complex scenarios, leading to incomplete assessments (e.g., d…
External link:
http://arxiv.org/abs/2410.12855
Metaverse applications desire to communicate with semantically identified objects among a diverse set of cyberspace entities, such as cameras for collecting images from, sensors for sensing environment, and users collaborating with each other, all co…
External link:
http://arxiv.org/abs/2407.15234
Author:
Yu, Tianyuan, Ma, Xinyu, Patil, Varun, Kocaogullar, Yekta, Zhang, Yulong, Burke, Jeff, Kutscher, Dirk, Zhang, Lixia
This position paper explores how to support the Web's evolution through an underlying data-centric approach that better matches the data-orientedness of modern and emerging applications. We revisit the original vision of the Web as a hypermedia syste…
External link:
http://arxiv.org/abs/2407.15221
Despite significant progress in model editing methods, their application in real-world scenarios remains challenging as they often cause large language models (LLMs) to collapse. Among them, ROME is particularly concerning, as it could disrupt LLMs w…
External link:
http://arxiv.org/abs/2406.11263
Author:
Ma, Xinyu, Liu, Xuebo, Wong, Derek F., Rao, Jun, Li, Bei, Ding, Liang, Chao, Lidia S., Tao, Dacheng, Zhang, Min
Multimodal machine translation (MMT) is a challenging task that seeks to improve translation quality by incorporating visual information. However, recent studies have indicated that the visual information provided by existing MMT datasets is insuffic…
External link:
http://arxiv.org/abs/2404.18413
Parameter-efficient fine-tuning methods, represented by LoRA, play an essential role in adapting large-scale pre-trained models to downstream tasks. However, fine-tuning LoRA-series models also faces the risk of overfitting on the training dataset, a…
External link:
http://arxiv.org/abs/2404.09610
With the increasingly powerful performances and enormous scales of pretrained models, promoting parameter efficiency in fine-tuning has become a crucial need for effective and efficient adaptation to various downstream tasks. One representative line…
External link:
http://arxiv.org/abs/2404.04316
Although model editing has shown promise in revising knowledge in Large Language Models (LLMs), its impact on the inherent capabilities of LLMs is often overlooked. In this work, we reveal a critical phenomenon: even a single edit can trigger model c…
External link:
http://arxiv.org/abs/2402.09656