Showing 1 - 10 of 45
for search: '"Rahimi, Razieh"'
Author:
Drozdov, Andrew, Zhuang, Honglei, Dai, Zhuyun, Qin, Zhen, Rahimi, Razieh, Wang, Xuanhui, Alon, Dana, Iyyer, Mohit, McCallum, Andrew, Metzler, Donald, Hui, Kai
Recent studies show that large language models (LLMs) can be instructed to effectively perform zero-shot passage re-ranking, in which the results of a first-stage retrieval method, such as BM25, are rated and reordered to improve relevance. In this w…
External link:
http://arxiv.org/abs/2310.14408
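The abstract above describes re-ranking a first-stage BM25 candidate list with LLM-assigned relevance scores. A minimal sketch of that pipeline shape, assuming a hypothetical `llm_relevance_score` callable standing in for a prompted LLM (the toy scorer below is purely illustrative, not the paper's method):

```python
def rerank(query, passages, llm_relevance_score):
    """Re-order first-stage candidates by a model-assigned relevance score."""
    scored = [(llm_relevance_score(query, p), p) for p in passages]
    scored.sort(key=lambda pair: pair[0], reverse=True)  # best score first
    return [p for _, p in scored]

# Toy stand-in scorer; a real system would prompt an LLM here.
def toy_scorer(query, passage):
    return sum(passage.lower().count(w) for w in query.lower().split())

candidates = ["dogs bark loudly", "cats sleep a lot", "cats and dogs"]
ranked = rerank("cats", candidates, toy_scorer)
```

Because Python's sort is stable, candidates with equal scores keep their first-stage (BM25) order, which is the usual tie-breaking convention in re-ranking.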
Understanding why a model makes certain predictions is crucial when adapting it for real-world decision making. LIME is a popular model-agnostic feature attribution method for the tasks of classification and regression. However, the task of learning…
External link:
http://arxiv.org/abs/2212.12722
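LIME, mentioned in the abstract above, explains a single prediction by fitting a proximity-weighted linear surrogate to the black-box model around the input; the surrogate's coefficients serve as feature attributions. A minimal tabular-data sketch, assuming a Gaussian perturbation and kernel (function name and parameters are illustrative, not the LIME library's API):

```python
import numpy as np

def lime_attributions(model, x, n_samples=500, sigma=1.0, seed=0):
    """Perturb around x, weight samples by proximity, fit weighted
    least squares; the coefficients are per-feature attributions."""
    rng = np.random.default_rng(seed)
    Z = x + rng.normal(scale=sigma, size=(n_samples, x.size))
    y = model(Z)                                  # black-box predictions
    d = np.linalg.norm(Z - x, axis=1)
    w = np.exp(-(d ** 2) / (2 * sigma ** 2))      # proximity kernel
    A = np.hstack([Z, np.ones((n_samples, 1))])   # add intercept column
    W = np.sqrt(w)[:, None]                       # sqrt-weights for WLS
    coef, *_ = np.linalg.lstsq(A * W, y * W[:, 0], rcond=None)
    return coef[:-1]                              # drop the intercept

# Sanity check: for a model that is already linear, the surrogate
# should recover its coefficients almost exactly.
f = lambda Z: Z @ np.array([2.0, -1.0])
attr = lime_attributions(f, np.array([1.0, 1.0]))
```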
Retrieval-enhanced language models (LMs), which condition their predictions on text retrieved from large external datastores, have recently shown significant perplexity improvements compared to standard LMs. One such approach, the $k$NN-LM, interpola…
External link:
http://arxiv.org/abs/2210.15859
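The $k$NN-LM named in the abstract above interpolates the base LM's next-token distribution with one induced by nearest-neighbor retrieval from the datastore: $p(w \mid c) = \lambda\, p_{kNN}(w \mid c) + (1 - \lambda)\, p_{LM}(w \mid c)$. A minimal sketch of just that mixing step, with toy distributions and an illustrative $\lambda$:

```python
def interpolate(p_lm, p_knn, lam=0.25):
    """Mix two next-token distributions (dicts of token -> probability):
    p(w) = lam * p_kNN(w) + (1 - lam) * p_LM(w)."""
    vocab = set(p_lm) | set(p_knn)
    return {w: lam * p_knn.get(w, 0.0) + (1 - lam) * p_lm.get(w, 0.0)
            for w in vocab}

p_lm = {"cat": 0.6, "dog": 0.4}
p_knn = {"cat": 1.0}   # retrieved neighbors all continue with "cat"
mix = interpolate(p_lm, p_knn, lam=0.25)
```

Since both inputs sum to 1 and the mixture is convex, the result is again a valid distribution; retrieval here shifts mass toward "cat".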
We present GenEx, a generative model to explain search results to users beyond just showing matches between query and document words. Adding GenEx explanations to search results greatly impacts user satisfaction and search performance. Search engines…
External link:
http://arxiv.org/abs/2111.01314
Published in:
In Proceedings of the 30th ACM International Conference on Information and Knowledge Management (CIKM '21), November 1-5, 2021, Virtual Event, QLD, Australia. ACM, New York, NY, USA, 5 pages.
Transformer-based rankers have shown state-of-the-art performance. However, their self-attention operation is mostly unable to process long sequences. One of the common approaches to train these rankers is to heuristically select some segments of eac…
External link:
http://arxiv.org/abs/2109.04611
Pretrained contextualized representations offer great success for many downstream tasks, including document ranking. The multilingual versions of such pretrained representations provide a possibility of jointly learning many languages with the same m…
External link:
http://arxiv.org/abs/2109.02789
Author:
Foroozesh, Farzaneh, Monavari, Seyed Massoud, Salmanmahiny, Abdolrassoul, Robati, Maryam, Rahimi, Razieh
Published in:
In Sustainable Cities and Society January 2022 76
Academic article
Published in:
In Information Processing and Management March 2016 52(2):299-318
Author:
Golrokhi, Raheleh, Manshadi, Seyed Ali Dehghan, SeyedAlinaghi, SeyedAhmad, Mohraz, Minoo, Jafarinasab, Masoud, Rahimi, Razieh, Dadras, Omid
Published in:
HIV & AIDS Review; 2023, Vol. 22 Issue 3, p251-260, 10p