Showing 1 - 10 of 11 792 for the search: '"SUBBA, REDDY"'
Transformer-based models have revolutionized the field of natural language processing. To understand why they perform so well and to assess their reliability, several studies have focused on questions such as: Which linguistic properties are encoded … (a sketch of such a probing setup follows the link below).
External link:
http://arxiv.org/abs/2410.02611
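The probing line of work referenced in the snippet above is typically implemented by freezing a pretrained transformer and training a small classifier on one of its hidden layers. The following is a minimal sketch of that setup, not the method of the paper itself; the model name, the choice of layer 8, and the toy sentence-level labels are all illustrative assumptions.

import numpy as np
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

# Frozen encoder whose hidden layers we probe (the model choice is an assumption).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Toy probing dataset: sentences paired with a hypothetical linguistic label
# (here: 0 = singular subject, 1 = plural subject).
sentences = ["The cat sat on the mat.", "The dogs bark loudly.",
             "A child reads quietly.", "The birds fly south."]
labels = [0, 1, 0, 1]

features = []
with torch.no_grad():
    for s in sentences:
        enc = tokenizer(s, return_tensors="pt")
        out = model(**enc)
        # Mean-pool the tokens of one hidden layer as the probed representation.
        features.append(out.hidden_states[8].mean(dim=1).squeeze(0).numpy())

# A linear probe: if it separates the labels, the layer arguably encodes the property.
probe = LogisticRegression(max_iter=1000).fit(np.stack(features), labels)
print("training accuracy of the probe:", probe.score(np.stack(features), labels))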
Identifying users' opinions and stances in long conversation threads on various topics can be extremely critical for enhanced personalization, market research, political campaigns, customer service, conflict resolution, targeted advertising, and cont…
External link:
http://arxiv.org/abs/2406.16833
Author:
Pantokratoras, Asterios
In the above paper, the authors treat the natural convection boundary layer flow in a Darcy-Brinkman-Forchheimer porous medium. In the energy equation the radiation effect has been taken into account. A two-parameter perturbation method is used for th… (a sketch of the governing momentum balance follows the link below).
External link:
http://arxiv.org/abs/0708.2713
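For readers unfamiliar with the model named in the snippet above, a commonly used boundary-layer form of the Darcy-Brinkman-Forchheimer momentum balance for natural convection past a heated vertical surface is sketched below; the exact formulation and the radiation term used in the commented paper may differ.

    \mu_{\mathrm{eff}}\,\frac{\partial^{2} u}{\partial y^{2}}
    \;-\; \frac{\mu}{K}\, u
    \;-\; \frac{\rho\, c_F}{\sqrt{K}}\, u^{2}
    \;+\; \rho g \beta \,(T - T_{\infty}) \;=\; 0

Here the Brinkman term carries the effective viscosity \mu_{\mathrm{eff}}, the Darcy term the permeability K, and the Forchheimer term the inertial coefficient c_F; with Rosseland-type radiation the energy equation gains an additional diffusive flux proportional to \partial T^{4}/\partial y.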
Despite known differences between reading and listening in the brain, recent work has shown that text-based language models predict both text-evoked and speech-evoked brain activity to an impressive degree. This poses the question of what types of in… (a sketch of a typical encoding model follows the link below).
External link:
http://arxiv.org/abs/2311.04664
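Work of the kind summarized in the snippet above usually fits a voxelwise linear encoding model: language-model features of the stimuli are regressed onto brain responses and evaluated by correlation on held-out data. The sketch below uses synthetic arrays in place of real features and fMRI recordings, and ridge regression with an arbitrary penalty, purely to show the shape of the pipeline.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 768))   # stand-in LM features, one row per stimulus
Y = rng.standard_normal((200, 500))   # stand-in fMRI responses, one column per voxel

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)
encoder = Ridge(alpha=10.0).fit(X_tr, Y_tr)   # one linear map, fit jointly for all voxels
pred = encoder.predict(X_te)

# Voxelwise Pearson correlation between predicted and held-out responses,
# the usual score for encoding models (near zero here, since the data are random).
r = np.array([np.corrcoef(pred[:, v], Y_te[:, v])[0, 1] for v in range(Y.shape[1])])
print("mean voxelwise correlation:", r.mean())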
Published in:
Organic Synthesis: State of the Art, 2013-2015, 2017.
External link:
https://doi.org/10.1093/oso/9780190646165.003.0058
Author:
Oota, Subba Reddy, Chen, Zijiao, Gupta, Manish, Bapi, Raju S., Jobard, Gael, Alexandre, Frederic, Hinaut, Xavier
Can we obtain insights about the brain using AI models? How is the information in deep learning models related to brain recordings? Can we improve AI models with the help of brain recordings? Such questions can be tackled by studying brain recordings …
External link:
http://arxiv.org/abs/2307.10246
Author:
Neerudu, Pavan Kalyan Reddy, Oota, Subba Reddy, Marreddy, Mounika, Kagita, Venkateswara Rao, Gupta, Manish
Transformer-based pretrained models like BERT, GPT-2 and T5 have been finetuned for a large number of natural language processing (NLP) tasks, and have been shown to be very effective. However, while finetuning, what changes across layers in these mo… (a sketch of a layer-wise comparison follows the link below).
External link:
http://arxiv.org/abs/2305.14453
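One common way to ask "what changes across layers" is to compare each layer's representations before and after finetuning with a similarity index such as linear CKA. The sketch below assumes that metric and a publicly available SST-2 checkpoint purely as an example of a finetuned model; neither choice is taken from the paper above.

import numpy as np
import torch
from transformers import AutoModel, AutoTokenizer

def linear_cka(X, Y):
    # Linear centered kernel alignment between two (samples x features) matrices.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    return (np.linalg.norm(X.T @ Y) ** 2 /
            (np.linalg.norm(X.T @ X) * np.linalg.norm(Y.T @ Y)))

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
pretrained = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
# Any finetuned checkpoint of the same architecture would do; this one is only an example.
finetuned = AutoModel.from_pretrained("textattack/bert-base-uncased-SST-2",
                                      output_hidden_states=True)

sentences = ["A genuinely moving film.", "The plot never comes together."]
enc = tok(sentences, return_tensors="pt", padding=True)
with torch.no_grad():
    h_pre = pretrained(**enc).hidden_states
    h_fin = finetuned(**enc).hidden_states

# Lower CKA at a layer means finetuning changed that layer's representations more.
for layer in range(1, len(h_pre)):
    a = h_pre[layer].reshape(-1, h_pre[layer].shape[-1]).numpy()
    b = h_fin[layer].reshape(-1, h_fin[layer].shape[-1]).numpy()
    print(f"layer {layer:2d}: CKA = {linear_cka(a, b):.3f}")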
Author:
Jyothi, Koganti Krishna, Borra, Subba Reddy, Srilakshmi, Koganti, Balachandran, Praveen Kumar, Reddy, Ganesh Prasad, Colak, Ilhami, Dhanamjayulu, C. (dhanamjayulu.c@vit.ac.in), Chinthaginjala, Ravikumar, Khan, Baseem (baseemkh@hu.edu.et)
Published in:
Scientific Reports, Vol. 14, Issue 1, pp. 1-11 (8 Aug 2024).
Syntactic parsing is the task of assigning a syntactic structure to a sentence. There are two popular syntactic parsing methods: constituency and dependency parsing. Recent works have used syntactic embeddings based on constituency trees, incremental … (a small dependency-parsing sketch follows the link below).
External link:
http://arxiv.org/abs/2302.08589
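To make the constituency-versus-dependency distinction in the snippet above concrete, the short sketch below prints a dependency parse with spaCy; constituency parsing would instead produce a nested phrase-structure tree and typically needs a separate parser (e.g., benepar), which is not shown here. The pipeline name is the standard small English model and must be downloaded beforehand.

import spacy

# Requires: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
doc = nlp("The quick brown fox jumps over the lazy dog.")

# Dependency parsing assigns every token a head word and a relation label.
for token in doc:
    print(f"{token.text:8s} --{token.dep_:>6s}--> {token.head.text}")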
Published in:
Tetrahedron Chem, Vol. 12, p. 100104 (2024)
A novel palladium(II)-catalyzed C–H annulation strategy has been developed for the synthesis of a diverse range of furo[3′,4′:5,6]pyrido[3,4-b]indole-1,5(3H)-dione scaffolds. This is the first report on the construction of polycyclic carboline fr…
External link:
https://doaj.org/article/3744da7bc451470ab26957fd9a458fc6