Showing 1 - 3 of 3
for search: '"Walsh, Evan Pete"'
Author:
Magnusson, Ian, Bhagia, Akshita, Hofmann, Valentin, Soldaini, Luca, Jha, Ananya Harsh, Tafjord, Oyvind, Schwenk, Dustin, Walsh, Evan Pete, Elazar, Yanai, Lo, Kyle, Groeneveld, Dirk, Beltagy, Iz, Hajishirzi, Hannaneh, Smith, Noah A., Richardson, Kyle, Dodge, Jesse
Language models (LMs) commonly report perplexity on monolithic data held out from training. Implicitly or explicitly, this data is composed of domains–varying distributions of language. Rather than assuming perplexity on one distribut…
External link:
http://arxiv.org/abs/2312.10523
Author:
Peng, Hao, Cao, Qingqing, Dodge, Jesse, Peters, Matthew E., Fernandez, Jared, Sherborne, Tom, Lo, Kyle, Skjonsberg, Sam, Strubell, Emma, Plessas, Darrell, Beltagy, Iz, Walsh, Evan Pete, Smith, Noah A., Hajishirzi, Hannaneh
Rising computational demands of modern natural language processing (NLP) systems have increased the barrier to entry for cutting-edge research while posing serious environmental concerns. Yet, progress on model efficiency has been impeded by practica…
External link:
http://arxiv.org/abs/2307.09701
Author:
Jha, Ananya Harsh, Sherborne, Tom, Walsh, Evan Pete, Groeneveld, Dirk, Strubell, Emma, Beltagy, Iz
Large language models (LLMs) enable unparalleled few- and zero-shot reasoning capabilities but at a high computational footprint. A growing assortment of methods for compression promises to reduce the computational burden of LLMs in deployment, but s…
External link:
http://arxiv.org/abs/2305.14864