Showing 1 - 6 of 6 results for the search: '"Merlin, Gabriele"'
The pretrain-finetune paradigm usually improves downstream performance over training a model from scratch on the same task and has become commonplace across many areas of machine learning. While pretraining is empirically observed to be beneficial for a r…
External link:
http://arxiv.org/abs/2307.06006
Author:
Merlin, Gabriele, Toneva, Mariya
Pretrained language models have been shown to significantly predict brain recordings of people comprehending language. Recent work suggests that the prediction of the next word is a key mechanism that contributes to this alignment. What is not yet un…
External link:
http://arxiv.org/abs/2212.00596
Published in:
ICIAP 2022 Workshops
Continual Learning requires the model to learn from a stream of dynamic, non-stationary data without forgetting previous knowledge. Several approaches have been developed in the literature to tackle the Continual Learning challenge. Among them, Repla…
External link:
http://arxiv.org/abs/2203.10317
In this poster, we present a visualization tool for the in-depth analysis of domestic electricity consumption. The web interface allows users to visualize their electricity consumption and compare it with their own records or with the means of selecte…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::84920eba247a21797f6dad2a7e4747f2
Published in:
Capital Humano. jul/ago2017, Vol. 30 Issue 322, p44-55. 12p.
The two-volume set LNCS 13373 and 13374 constitutes the papers of several workshops which were held in conjunction with the 21st International Conference on Image Analysis and Processing, ICIAP 2022, held in Lecce, Italy, in May 2022. The 96 revised f…