Showing 1 - 10 of 1,092
for search: '"Shamsian A"'
Large transformer-based models have significant potential for speech transcription and translation. Their self-attention mechanisms and parallel processing enable them to capture complex patterns and dependencies in audio sequences. However, this potential …
External link:
http://arxiv.org/abs/2409.15869
Integrating named entity recognition (NER) with automatic speech recognition (ASR) can significantly enhance transcription accuracy and informativeness. In this paper, we introduce WhisperNER, a novel model that allows joint speech transcription and …
External link:
http://arxiv.org/abs/2409.08107
Automatic Speech Recognition (ASR) technology has made significant progress in recent years, providing accurate transcription across various domains. However, some challenges remain, especially in noisy environments and specialized jargon. In this paper …
External link:
http://arxiv.org/abs/2406.02649
Published in:
International Journal of Nanomedicine, Vol. 15, pp. 4063-4078 (2020)
Azam Shamsian,1,2 Mohammad Reza Sepand,3 Marziye Javaheri Kachousangi,1,2 Tahereh Dara,4 Seyed Nasser Ostad,3 Fatemeh Atyabi,1,2 Mohammad Hossein Ghahremani1,3 1Nanotechnology Research Center, Faculty of Pharmacy, Tehran University of Medical Sciences …
External link:
https://doaj.org/article/b4650ada388f487aa13cc5d87add8265
Standard federated learning approaches suffer when client data distributions are sufficiently heterogeneous. Recent methods address client data heterogeneity via personalized federated learning (PFL), a class of FL algorithms aiming to …
External link:
http://arxiv.org/abs/2404.02478
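The PFL entry above can be illustrated with a minimal toy sketch: clients share a common base parameter that the server averages (FedAvg-style), while each client keeps a personal parameter that never leaves the device. The scalar linear models, function names, and learning rate below are illustrative assumptions, not the method from the paper.

```python
# Toy personalized federated learning (PFL): a shared base slope is
# averaged across clients each round; a per-client personal offset
# stays local, absorbing the heterogeneity between clients.

def local_step(base, personal, data, lr=0.1):
    """One SGD pass on y ~ base * x + personal for a single client."""
    for x, y in data:
        err = base * x + personal - y
        base -= lr * err * x      # gradient w.r.t. the shared base
        personal -= lr * err      # gradient w.r.t. the personal offset
    return base, personal

def pfl_round(base, personals, client_data, lr=0.1):
    """One communication round: clients train locally; the server
    averages only the shared base. Personal parameters are not sent."""
    new_bases = []
    for i, data in enumerate(client_data):
        b, p = local_step(base, personals[i], data, lr)
        new_bases.append(b)
        personals[i] = p
    return sum(new_bases) / len(new_bases), personals

# Two clients share the slope 2.0 but differ by an offset of +1 / -1.
clients = [
    [(1.0, 3.0), (2.0, 5.0)],   # y = 2x + 1
    [(1.0, 1.0), (2.0, 3.0)],   # y = 2x - 1
]
base, personals = 0.0, [0.0, 0.0]
for _ in range(200):
    base, personals = pfl_round(base, personals, clients)
print(base, personals)
```

A single shared model cannot fit both clients exactly; splitting the parameters lets the averaged base converge to the common slope while each personal offset fits its own client.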
One of the challenges in applying reinforcement learning in a complex real-world environment lies in providing the agent with a sufficiently detailed reward function. Any misalignment between the reward and the desired behavior can result in unwanted …
External link:
http://arxiv.org/abs/2402.11367
Author:
Shamsian, Aviv, Navon, Aviv, Zhang, David W., Zhang, Yan, Fetaya, Ethan, Chechik, Gal, Maron, Haggai
Learning in deep weight spaces (DWS), where neural networks process the weights of other neural networks, is an emerging research direction, with applications to 2D and 3D neural fields (INRs, NeRFs), as well as making inferences about other types of …
External link:
http://arxiv.org/abs/2402.04081
Author:
Shamsian, Aviv, Zhang, David W., Navon, Aviv, Zhang, Yan, Kofinas, Miltiadis, Achituve, Idan, Valperga, Riccardo, Burghouts, Gertjan J., Gavves, Efstratios, Snoek, Cees G. M., Fetaya, Ethan, Chechik, Gal, Maron, Haggai
Learning in weight spaces, where neural networks process the weights of other deep neural networks, has emerged as a promising research direction with applications in various fields, from analyzing and editing neural fields and implicit neural representations …
External link:
http://arxiv.org/abs/2311.08851
Author:
Eitan, Daniel, Pirchi, Menachem, Glazer, Neta, Meital, Shai, Ayach, Gil, Krendel, Gidon, Shamsian, Aviv, Navon, Aviv, Hetz, Gil, Keshet, Joseph
General purpose language models (LMs) encounter difficulties when processing domain-specific jargon and terminology, which are frequently utilized in specialized fields such as medicine or industrial settings. Moreover, they often find it challenging …
External link:
http://arxiv.org/abs/2310.19708
Permutation symmetries of deep networks make basic operations like model merging and similarity estimation challenging. In many cases, aligning the weights of the networks, i.e., finding optimal permutations between their weights, is necessary. Unfortunately, …
External link:
http://arxiv.org/abs/2310.13397
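The permutation symmetry mentioned in this last entry can be shown concretely: reordering the hidden units of an MLP (rows of the first weight matrix, matching columns of the second) leaves its function unchanged, so comparing two networks weight-to-weight requires aligning them first. The brute-force search below is a toy illustration over a tiny hidden layer, not the paper's algorithm, and all names are illustrative.

```python
# Toy weight alignment under hidden-unit permutation symmetry for a
# 2 -> 3 -> 1 MLP. Brute force over all 3! permutations; practical
# methods solve this assignment problem at scale.
from itertools import permutations

def apply_perm(w1, w2, perm):
    """Reorder hidden units: rows of layer 1, columns of layer 2.
    This reordering does not change the network's input-output map."""
    w1p = [w1[i] for i in perm]
    w2p = [[row[i] for i in perm] for row in w2]
    return w1p, w2p

def sq_dist(a, b):
    return sum((x - y) ** 2
               for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def align(net_a, net_b):
    """Find the hidden-unit permutation of net_b minimizing squared
    weight distance to net_a."""
    (w1a, w2a), (w1b, w2b) = net_a, net_b
    def cost(p):
        w1p, w2p = apply_perm(w1b, w2b, p)
        return sq_dist(w1a, w1p) + sq_dist(w2a, w2p)
    return min(permutations(range(len(w1a))), key=cost)

# Network A: 2 inputs -> 3 hidden units -> 1 output.
w1a = [[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]]
w2a = [[1.0, 2.0, 3.0]]
# Network B: the same network with its hidden units shuffled.
w1b, w2b = apply_perm(w1a, w2a, (2, 0, 1))
perm = align((w1a, w2a), (w1b, w2b))
w1r, w2r = apply_perm(w1b, w2b, perm)
```

Here the naive weight distance between A and B is nonzero even though they compute the same function; after alignment the recovered weights match A exactly, which is why merging or comparing models is done post-alignment.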