Showing 1 - 10 of 1,066 for the search: '"Ranathunga AN"'
Author:
Rathnayake, Charitha, Thilakarathna, P. R. S., Nethmini, Uthpala, Kaur, Rishemjith, Ranathunga, Surangika
Bilingual lexicons play a crucial role in various Natural Language Processing tasks. However, many low-resource languages (LRLs) do not have such lexicons and, for the same reason, cannot benefit from supervised Bilingual Lexicon Induction (BLI) …
External link:
http://arxiv.org/abs/2412.16894
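Supervised BLI, as referenced in the entry above, is commonly approached by learning a linear mapping between two monolingual embedding spaces from a seed dictionary (orthogonal Procrustes alignment). The following is a minimal illustrative sketch of that general idea in Python with NumPy; it is not the method of this particular paper, and the random matrices stand in for real word embeddings.

    import numpy as np

    # X: source-language embeddings, Y: target-language embeddings for the
    # seed-dictionary pairs (shape [n_pairs, dim]); random stand-ins here.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 300))
    Y = rng.normal(size=(1000, 300))

    # Orthogonal Procrustes: W = U V^T minimises ||XW - Y|| over orthogonal W.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    W = U @ Vt

    def nearest_target(src_vec, target_matrix):
        # Map a source word vector into the target space and return the index
        # of the most similar target vector by cosine similarity.
        mapped = src_vec @ W
        sims = (target_matrix @ mapped) / (
            np.linalg.norm(target_matrix, axis=1) * np.linalg.norm(mapped) + 1e-9)
        return int(np.argmax(sims))

With real word embeddings and a small seed lexicon, the induced nearest neighbours form candidate bilingual lexicon entries.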
In this paper, we address the challenge of recipe personalization through ingredient substitution. We make use of Large Language Models (LLMs) to build an ingredient substitution system designed to predict plausible substitute ingredients within a given recipe …
External link:
http://arxiv.org/abs/2412.04922
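As a rough illustration of prompting an LLM for ingredient substitutes, the sketch below uses the Hugging Face transformers text-generation pipeline. The model choice and prompt wording are assumptions for demonstration only and are unrelated to the system described in the paper; a small model such as gpt2 will not give reliable suggestions.

    from transformers import pipeline

    # Illustrative only: an instruction-tuned model would be a better fit in practice.
    generator = pipeline("text-generation", model="gpt2")

    recipe = "pancakes: flour, milk, eggs, butter, sugar"
    prompt = (f"Recipe: {recipe}\n"
              "Suggest one plausible substitute for 'butter' in this recipe:")
    print(generator(prompt, max_new_tokens=30)[0]["generated_text"])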
Author:
Ranathunga, Surangika, Sirithunga, Rumesh, Rathnayake, Himashi, De Silva, Lahiru, Aluthwala, Thamindu, Peramuna, Saman, Shekhar, Ravi
Text Simplification is a task that has been minimally explored for low-resource languages. Consequently, there are only a few manually curated datasets. In this paper, we present a human-curated sentence-level text simplification dataset for the Sinhala language …
External link:
http://arxiv.org/abs/2412.01293
Author:
Ranathunga, Surangika, Ranasinghea, Asanka, Shamala, Janaka, Dandeniyaa, Ayodya, Galappaththia, Rashmi, Samaraweeraa, Malithi
This paper presents a multi-way parallel English-Tamil-Sinhala corpus annotated with Named Entities (NEs), where Sinhala and Tamil are low-resource languages. Using pre-trained multilingual Language Models (mLMs), we establish new benchmark Named Entity Recognition …
External link:
http://arxiv.org/abs/2412.02056
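Establishing NER benchmarks with pre-trained multilingual language models typically means fine-tuning them for token classification. The sketch below shows the general pattern with Hugging Face transformers; the xlm-roberta-base checkpoint and the label set are assumptions, not necessarily the models or tag scheme used in the paper.

    from transformers import AutoTokenizer, AutoModelForTokenClassification

    labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]
    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
    model = AutoModelForTokenClassification.from_pretrained(
        "xlm-roberta-base", num_labels=len(labels))

    # Tag one sentence; before fine-tuning on the annotated corpus the
    # predicted tags are essentially random.
    tokens = tokenizer("Surangika works in Colombo", return_tensors="pt")
    predictions = model(**tokens).logits.argmax(-1)
    print([labels[i] for i in predictions[0].tolist()])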
Author:
Ahamed, Ishrath, Ranathunga, Chamith Dilshan, Udayantha, Dinuka Sandun, Ng, Benny Kai Kiat, Yuen, Chau
Accurate people counting in smart buildings and intelligent transportation systems is crucial for energy management, safety protocols, and resource allocation. This is especially critical during emergencies, where precise occupant counts are vital for …
External link:
http://arxiv.org/abs/2411.10072
Transfer Learning on Transformers for Building Energy Consumption Forecasting -- A Comparative Study
This study investigates the application of Transfer Learning (TL) on Transformer architectures to enhance building energy consumption forecasting. Transformers are a relatively new deep learning architecture, which has served as the foundation for groundbreaking …
External link:
http://arxiv.org/abs/2410.14107
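The transfer-learning recipe referred to above generally amounts to pre-training a Transformer on data-rich source buildings, freezing it (or part of it), and fine-tuning a small head on the target building's limited history. A minimal PyTorch sketch of that pattern follows; the layer sizes, window length, and full-encoder freezing are assumptions, not the study's exact configurations.

    import torch
    import torch.nn as nn

    # Generic Transformer encoder over 24-step feature windows (random stand-in data).
    encoder_layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
    encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
    head = nn.Linear(64, 1)  # predicts the next step's consumption

    # Transfer step: keep the (pre-trained) encoder frozen and train only the head
    # on the small target-building dataset.
    for p in encoder.parameters():
        p.requires_grad = False
    optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

    x = torch.randn(8, 24, 64)          # batch of 24-step feature windows
    y = torch.randn(8, 1)               # next-step consumption targets
    pred = head(encoder(x)[:, -1, :])   # use the last time step's representation
    loss = nn.functional.mse_loss(pred, y)
    loss.backward()
    optimizer.step()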
Author:
Du, Jiangshu, Wang, Yibo, Zhao, Wenting, Deng, Zhongfen, Liu, Shuaiqi, Lou, Renze, Zou, Henry Peng, Venkit, Pranav Narayanan, Zhang, Nan, Srinath, Mukund, Zhang, Haoran Ranran, Gupta, Vipul, Li, Yinghui, Li, Tao, Wang, Fei, Liu, Qin, Liu, Tianlin, Gao, Pengzhi, Xia, Congying, Xing, Chen, Cheng, Jiayang, Wang, Zhaowei, Su, Ying, Shah, Raj Sanjay, Guo, Ruohao, Gu, Jing, Li, Haoran, Wei, Kangda, Wang, Zihao, Cheng, Lu, Ranathunga, Surangika, Fang, Meng, Fu, Jie, Liu, Fei, Huang, Ruihong, Blanco, Eduardo, Cao, Yixin, Zhang, Rui, Yu, Philip S., Yin, Wenpeng
This work is motivated by two key trends. On one hand, large language models (LLMs) have shown remarkable versatility in various generative tasks such as writing, drawing, and question answering, significantly reducing the time required for many routine tasks …
External link:
http://arxiv.org/abs/2406.16253
We analysed a sample of NLP research papers archived in the ACL Anthology in an attempt to quantify the degree of openness and the benefit of such an open culture in the NLP community. We observe that papers published in different NLP venues show different …
External link:
http://arxiv.org/abs/2406.06021
Author:
Susnjak, Teo, Hwang, Peter, Reyes, Napoleon H., Barczak, Andre L. C., McIntosh, Timothy R., Ranathunga, Surangika
This research pioneers the use of fine-tuned Large Language Models (LLMs) to automate Systematic Literature Reviews (SLRs), presenting a significant and novel contribution in integrating AI to enhance academic research methodologies. Our study employs …
External link:
http://arxiv.org/abs/2404.08680
Author:
Su, Tong, Peng, Xin, Thillainathan, Sarubi, Guzmán, David, Ranathunga, Surangika, Lee, En-Shiun Annie
Parameter-efficient fine-tuning (PEFT) methods are increasingly vital in adapting large-scale pre-trained language models for diverse tasks, offering a balance between adaptability and computational efficiency. They are important in Low-Resource Languages …
External link:
http://arxiv.org/abs/2404.04212
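PEFT methods keep the large pre-trained model frozen and train only a small number of added parameters, which is why they suit low-resource settings. A minimal sketch of one common PEFT method, LoRA, using the Hugging Face peft library is below; the base checkpoint and hyperparameters are assumptions, not the configurations compared in the paper.

    from transformers import AutoModelForSeq2SeqLM
    from peft import LoraConfig, get_peft_model

    # Illustrative base model; LoRA adapters are injected into the attention
    # query and value projections of the (frozen) mT5 encoder-decoder.
    base = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")
    config = LoraConfig(r=8, lora_alpha=16, target_modules=["q", "v"],
                        lora_dropout=0.05, task_type="SEQ_2_SEQ_LM")
    model = get_peft_model(base, config)

    # Only the small low-rank adapter matrices are trainable; this prints the
    # trainable fraction, a small share of all parameters.
    model.print_trainable_parameters()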