Showing 1 - 10 of 138 for search: '"Wibowo Haryo"'
Author:
Yandri Erkata, Pramono Kukuh Priyo, Sihombing Very, Effendi Luqmanul Hakim, Ardianto Denis, Setyobudi Roy Hendroko, Suherman Suherman, Wahono Satriyo Krido, Wibowo Haryo, Garfansa Marchel Putra, Farzana Afrida Rizka
Published in:
BIO Web of Conferences, Vol 104, p 00012 (2024)
Energy Management Systems (EMS) have become increasingly important in efforts to address global energy challenges, such as increasing energy demand and climate change. EMS can be used to improve energy efficiency, reduce greenhouse gas emissions, and …
External link:
https://doaj.org/article/36ee6a4bd065438aa28422077f27a28f
Author:
Winata, Genta Indra, Hudi, Frederikus, Irawan, Patrick Amadeus, Anugraha, David, Putri, Rifki Afina, Wang, Yutong, Nohejl, Adam, Prathama, Ubaidillah Ariq, Ousidhoum, Nedjma, Amriani, Afifa, Rzayev, Anar, Das, Anirban, Pramodya, Ashmari, Adila, Aulia, Wilie, Bryan, Mawalim, Candy Olivia, Cheng, Ching Lam, Abolade, Daud, Chersoni, Emmanuele, Santus, Enrico, Ikhwantri, Fariz, Kuwanto, Garry, Zhao, Hanyang, Wibowo, Haryo Akbarianto, Lovenia, Holy, Cruz, Jan Christian Blaise, Putra, Jan Wira Gotama, Myung, Junho, Susanto, Lucky, Machin, Maria Angelica Riera, Zhukova, Marina, Anugraha, Michael, Adilazuarda, Muhammad Farid, Santosa, Natasha, Limkonchotiwat, Peerat, Dabre, Raj, Audino, Rio Alexander, Cahyawijaya, Samuel, Zhang, Shi-Xiong, Salim, Stephanie Yulia, Zhou, Yi, Gui, Yinxuan, Adelani, David Ifeoluwa, Lee, En-Shiun Annie, Okada, Shogo, Purwarianti, Ayu, Aji, Alham Fikri, Watanabe, Taro, Wijaya, Derry Tanti, Oh, Alice, Ngo, Chong-Wah
Vision Language Models (VLMs) often struggle with culture-specific knowledge, particularly in languages other than English and in underrepresented cultural contexts. To evaluate their understanding of such knowledge, we introduce WorldCuisines, a mas…
External link:
http://arxiv.org/abs/2410.12705
Knowledge distillation (KD) has proven to be a successful strategy to improve the performance of a smaller model in many NLP tasks. However, most of the work in KD only explores monolingual scenarios. In this paper, we investigate the value of KD in …
External link:
http://arxiv.org/abs/2406.16524
Author:
Romero, David, Lyu, Chenyang, Wibowo, Haryo Akbarianto, Lynn, Teresa, Hamed, Injy, Kishore, Aditya Nanda, Mandal, Aishik, Dragonetti, Alina, Abzaliev, Artem, Tonja, Atnafu Lambebo, Balcha, Bontu Fufa, Whitehouse, Chenxi, Salamea, Christian, Velasco, Dan John, Adelani, David Ifeoluwa, Meur, David Le, Villa-Cueva, Emilio, Koto, Fajri, Farooqui, Fauzan, Belcavello, Frederico, Batnasan, Ganzorig, Vallejo, Gisela, Caulfield, Grainne, Ivetta, Guido, Song, Haiyue, Ademtew, Henok Biadglign, Maina, Hernán, Lovenia, Holy, Azime, Israel Abebe, Cruz, Jan Christian Blaise, Gala, Jay, Geng, Jiahui, Ortiz-Barajas, Jesus-German, Baek, Jinheon, Dunstan, Jocelyn, Alemany, Laura Alonso, Nagasinghe, Kumaranage Ravindu Yasas, Benotti, Luciana, D'Haro, Luis Fernando, Viridiano, Marcelo, Estecha-Garitagoitia, Marcos, Cabrera, Maria Camila Buitrago, Rodríguez-Cantelar, Mario, Jouitteau, Mélanie, Mihaylov, Mihail, Imam, Mohamed Fazli Mohamed, Adilazuarda, Muhammad Farid, Gochoo, Munkhjargal, Otgonbold, Munkh-Erdene, Etori, Naome, Niyomugisha, Olivier, Silva, Paula Mónica, Chitale, Pranjal, Dabre, Raj, Chevi, Rendi, Zhang, Ruochen, Diandaru, Ryandito, Cahyawijaya, Samuel, Góngora, Santiago, Jeong, Soyeong, Purkayastha, Sukannya, Kuribayashi, Tatsuki, Jayakumar, Thanmay, Torrent, Tiago Timponi, Ehsan, Toqeer, Araujo, Vladimir, Kementchedjhieva, Yova, Burzo, Zara, Lim, Zheng Wei, Yong, Zheng Xin, Ignat, Oana, Nwatu, Joan, Mihalcea, Rada, Solorio, Thamar, Aji, Alham Fikri
Visual Question Answering (VQA) is an important task in multimodal AI, and it is often used to test the ability of vision-language models to understand and reason over knowledge present in both visual and textual data. However, most of the current VQA …
External link:
http://arxiv.org/abs/2406.05967
Published in:
BIO Web of Conferences, Vol 62, p 02001 (2023)
In this study, the effect of activation procedures on the ammonia adsorption of BFA was investigated. BFA was activated by chemical and physical methods, and the adsorption capacity and surface properties of BFA were analyzed. The results showed that …
External link:
https://doaj.org/article/492a4202786e4f9981b91818046a9474
Author:
Wibowo, Haryo Akbarianto, Fuadi, Erland Hilman, Nityasya, Made Nindyatama, Prasojo, Radityo Eko, Aji, Alham Fikri
We present COPAL-ID, a novel, public Indonesian language common sense reasoning dataset. Unlike the previous Indonesian COPA dataset (XCOPA-ID), COPAL-ID incorporates Indonesian local and cultural nuances, and therefore, provides a more natural portr…
External link:
http://arxiv.org/abs/2311.01012
Author:
Yandri Erkata, Novianto Bangun, Fridolini Fridolini, Hendroko Setyabudi Roy, Wibowo Haryo, Krido Wahono Satriyo, Abdullah Kamaruddin, Purba Washington, Adhi Nugroho Yogo
Published in:
E3S Web of Conferences, Vol 226, p 00015 (2021)
The purpose of this study is to conceptualize an urban Hi-Tech Cook-Stove (HTCS) design using agricultural waste. Several steps need to be carried out. First, determine the cooking activities depending on the family size and food categories. Second, cal…
External link:
https://doaj.org/article/8620d188369944b7bc6134fe79e6d0ea
Author:
Nichita Kaikatui Rapha, Putra Andika Adik, Letsoin Vinsenius, Mangera Paulus, Hardiantono Damis, Adhi Nugroho Yogo, Susanto Herry, Wibowo Haryo
Published in:
E3S Web of Conferences, Vol 190, p 00032 (2020)
Energy demand increases in line with rapid technological advances. Research on harvesting renewable energy continues, with efforts to convert heat energy, which is abundant in our daily environment. Thermoelectric technology …
External link:
https://doaj.org/article/4f3b30c58ed24b1dae102a8b7448b2fb
Author:
Nityasya, Made Nindyatama, Wibowo, Haryo Akbarianto, Aji, Alham Fikri, Winata, Genta Indra, Prasojo, Radityo Eko, Blunsom, Phil, Kuncoro, Adhiguna
This evidence-based position paper critiques current research practices within the language model pre-training literature. Despite rapid recent progress afforded by increasingly better pre-trained language models (PLMs), current PLM research practice…
External link:
http://arxiv.org/abs/2306.02870
Author:
Nityasya, Made Nindyatama, Wibowo, Haryo Akbarianto, Chevi, Rendi, Prasojo, Radityo Eko, Aji, Alham Fikri
We perform a knowledge distillation (KD) benchmark from task-specific BERT-base teacher models to various student models: BiLSTM, CNN, BERT-Tiny, BERT-Mini, and BERT-Small. Our experiment involves 12 datasets grouped in two tasks: text classification a…
External link:
http://arxiv.org/abs/2201.00558