Showing 1 - 4 of 4
for search: '"Merlin, Gabriele"'
The pretrain-finetune paradigm usually improves downstream performance over training a model from scratch on the same task, becoming commonplace across many areas of machine learning. While pretraining is empirically observed to be beneficial for a r…
External link:
http://arxiv.org/abs/2307.06006
Authors:
Merlin, Gabriele; Toneva, Mariya
Pretrained language models have been shown to significantly predict brain recordings of people comprehending language. Recent work suggests that the prediction of the next word is a key mechanism that contributes to this alignment. What is not yet un…
External link:
http://arxiv.org/abs/2212.00596
Published in:
ICIAP 2022 Workshops
Continual Learning requires the model to learn from a stream of dynamic, non-stationary data without forgetting previous knowledge. Several approaches have been developed in the literature to tackle the Continual Learning challenge. Among them, Repla…
External link:
http://arxiv.org/abs/2203.10317
In this poster, we present a visualization tool for the in-depth analysis of domestic electricity consumption. The web interface allows users to visualize their electricity consumption and compare it with their own records or with the means of selecte…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::84920eba247a21797f6dad2a7e4747f2