Showing 1 - 10 of 18
for search: '"Luyu Gao"'
Published in:
Lecture Notes in Computer Science ISBN: 9783031282430
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::347ed03dc0e6c066ac2393ba0f059d8a
https://doi.org/10.1007/978-3-031-28244-7_19
Author:
Luyu Gao, Jamie Callan
Long document re-ranking has been a challenging problem for neural re-rankers based on deep language models like BERT. Early work breaks the documents into short passage-like chunks. These chunks are independently mapped to scalar scores or latent ve…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::6824a783a2e560ffd7d9d0510bb506ba
Published in:
NAACL-HLT
Classical information retrieval systems such as BM25 rely on exact lexical match and carry out search efficiently with an inverted list index. Recent neural IR models shift toward soft semantic matching of all query and document terms, but they lose the comp…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2e9340146b1bfc64271e4af4087f993b
http://arxiv.org/abs/2104.07186
Published in:
Lecture Notes in Computer Science ISBN: 9783030721121
ECIR (1)
This paper presents CLEAR, a retrieval model that seeks to complement classical lexical exact-match models such as BM25 with semantic matching signals from a neural embedding matching model. CLEAR explicitly trains the neural embedding to encode langu…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::be620c7abbf3eaad82f4f5c5a0fe34c4
https://doi.org/10.1007/978-3-030-72113-8_10
Author:
Luyu Gao, Jamie Callan
Recent research demonstrates the effectiveness of using fine-tuned language models (LMs) for dense retrieval. However, dense retrievers are hard to train, typically requiring heavily engineered fine-tuning pipelines to realize their full potential. In…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::9a5a17acf289cbba7f8a70c6ffff3afe
Published in:
Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021).
Contrastive learning has been applied successfully to learn vector representations of text. Previous research demonstrated that learning high-quality representations benefits from batch-wise contrastive loss with a large number of negatives. In pract…
Published in:
ICTIR
Deep language models such as BERT, pre-trained on large corpora, have given a huge performance boost to state-of-the-art information retrieval ranking systems. Knowledge embedded in such models allows them to pick up complex matching signals between…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::4cf20612dd9a1242274d07d98f510e5a
http://arxiv.org/abs/2007.11088
Published in:
Food Chemistry. 265:200-207
The goal of this study was to improve the chemical stability of menhaden oil and control the lipolysis in emulsions with whey protein during in vitro digestion through EGCG conjugation and genipin-mediated interfacial cross-linking (CL). WPI-EGCG con…
Published in:
Journal of Agricultural and Food Chemistry. 66:9481-9489
The effects of resveratrol (RES)-loaded whey protein isolate (WPI)-dextran nanocomplex on the physicochemical stability of β-carotene (BC) emulsions were evaluated. WPI-dextran was prepared by Maillard-based glycation and confirmed with gel electrop…
Published in:
Food & Function. 11(2)
Resveratrol (RES)-loaded protein–polysaccharide nanoparticles were fabricated through simple electrostatic interactions with oppositely charged α-lactalbumin (ALA) and chitosan (CHI) with a mass ratio of 5 : 1 without the addition of NaCl at pH 6.…