Showing 1 - 10 of 402 for the search: '"A. Patacchiola"'
In Few-Shot Learning (FSL), models are trained to recognise unseen objects from a query set, given a few labelled examples from a support set. In standard FSL, models are evaluated on query instances sampled from the same class distribution as the support set…
External link:
http://arxiv.org/abs/2408.02052
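As a concrete illustration of the support/query setup described above, the sketch below classifies query points against class centroids computed from the support set; the nearest-centroid rule and the random features are illustrative assumptions, not the paper's method.

import numpy as np

def classify_queries(support_x, support_y, query_x, n_classes):
    # Class centroid ("prototype") from the few labelled support examples per class
    prototypes = np.stack([support_x[support_y == c].mean(axis=0) for c in range(n_classes)])
    # Assign each query instance to the nearest centroid in feature space
    dists = np.linalg.norm(query_x[:, None, :] - prototypes[None, :, :], axis=-1)
    return dists.argmin(axis=1)

rng = np.random.default_rng(0)
support_x = rng.normal(size=(3 * 5, 16))          # 3-way, 5-shot support set
support_y = np.repeat(np.arange(3), 5)
query_x = rng.normal(size=(9, 16))                # unseen query instances
print(classify_queries(support_x, support_y, query_x, n_classes=3))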
Density estimation, a central problem in machine learning, can be performed using Normalizing Flows (NFs). NFs comprise a sequence of invertible transformations that turn a complex target distribution into a simple one by exploiting the change of variables…
External link:
http://arxiv.org/abs/2401.01855
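The change-of-variables rule the abstract refers to is log p_X(x) = log p_Z(f(x)) + log |det df/dx|. The toy sketch below applies it to a single invertible elementwise affine layer with a standard normal base distribution; the specific transformation is an illustrative assumption, not a flow from the paper.

import numpy as np

def affine_forward(x, scale, shift):
    # z = scale * x + shift is invertible whenever every scale entry is non-zero
    z = scale * x + shift
    log_det_jac = np.sum(np.log(np.abs(scale)))   # log |det| of an elementwise affine map
    return z, log_det_jac

def log_prob(x, scale, shift):
    z, log_det = affine_forward(x, scale, shift)
    log_base = -0.5 * np.sum(z**2 + np.log(2 * np.pi))   # standard normal base density
    return log_base + log_det

x = np.array([0.3, -1.2])
print(log_prob(x, scale=np.array([2.0, 0.5]), shift=np.array([0.1, -0.2])))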
In this paper we explore few-shot imitation learning for control problems, which involves learning to imitate a target policy by accessing a limited set of offline rollouts. This setting has been relatively under-explored despite its relevance to robotics…
External link:
http://arxiv.org/abs/2306.13554
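A minimal sketch of the setting: given a handful of offline rollouts (state-action pairs) from the target policy, fit a policy to reproduce the demonstrated actions. The two-layer network and mean-squared-error loss are illustrative assumptions, not the paper's method.

import torch
import torch.nn as nn

def clone_policy(states, actions, obs_dim, act_dim, steps=200):
    # Behaviour cloning: regress demonstrated actions from the offline rollouts
    policy = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, act_dim))
    opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(policy(states), actions)
        loss.backward()
        opt.step()
    return policy

states, actions = torch.randn(50, 4), torch.randn(50, 2)   # a small offline demonstration set
policy = clone_policy(states, actions, obs_dim=4, act_dim=2)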
Author:
Patacchiola, Massimiliano, Bronskill, John, Shysheya, Aliaksandra, Hofmann, Katja, Nowozin, Sebastian, Turner, Richard E.
Recent years have seen a growth in user-centric applications that require effective knowledge transfer across tasks in the low-data regime. An example is personalization, where a pretrained system is adapted by learning on small amounts of labeled data…
External link:
http://arxiv.org/abs/2206.09843
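One common form of the low-data adaptation the abstract describes is to freeze a pretrained backbone and fit only a small head on the user's few labelled examples; the sketch below shows that pattern under those assumed choices, not the paper's specific scheme.

import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())     # stand-in for a pretrained model
for p in backbone.parameters():
    p.requires_grad_(False)                                 # keep the pretrained weights fixed

head = nn.Linear(64, 5)                                     # small per-user classifier
opt = torch.optim.Adam(head.parameters(), lr=1e-2)

x, y = torch.randn(20, 32), torch.randint(0, 5, (20,))      # tiny personal dataset
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(head(backbone(x)), y)
    loss.backward()
    opt.step()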
Author:
Shysheya, Aliaksandra, Bronskill, John, Patacchiola, Massimiliano, Nowozin, Sebastian, Turner, Richard E.
Published in:
The Eleventh International Conference on Learning Representations (ICLR 2023)
Modern deep learning systems are increasingly deployed in situations such as personalization and federated learning where it is necessary to support i) learning on small amounts of data, and ii) communication-efficient distributed training protocols.
External link:
http://arxiv.org/abs/2206.08671
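One way to satisfy both requirements is to adapt, and therefore communicate, only a small subset of parameters. The sketch below merely counts the communicated fraction for a "biases only" subset, which is an illustrative assumption rather than the paper's scheme.

import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 100))
total = sum(p.numel() for p in model.parameters())
adapted = sum(p.numel() for name, p in model.named_parameters() if name.endswith("bias"))
print(f"fraction of parameters adapted and sent: {adapted / total:.4%}")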
Author:
Sendera, Marcin, Tabor, Jacek, Nowak, Aleksandra, Bedychaj, Andrzej, Patacchiola, Massimiliano, Trzciński, Tomasz, Spurek, Przemysław, Zięba, Maciej
Gaussian Processes (GPs) have been widely used in machine learning to model distributions over functions, with applications including multi-modal regression, time-series prediction, and few-shot learning. GPs are particularly useful in the last application…
External link:
http://arxiv.org/abs/2110.13561
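A minimal Gaussian-process regression sketch in the spirit of the abstract: the posterior mean at test inputs given a handful of observations, with an RBF kernel and fixed hyperparameters chosen purely for illustration.

import numpy as np

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel on 1-D inputs
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)   # E[f(x_test) | data]

x = np.array([-1.0, 0.0, 1.0])          # few-shot regression: three observations
y = np.sin(x)
print(gp_posterior_mean(x, y, np.linspace(-1.0, 1.0, 5)))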
Author:
Bronskill, John, Massiceti, Daniela, Patacchiola, Massimiliano, Hofmann, Katja, Nowozin, Sebastian, Turner, Richard E.
Published in:
35th Conference on Neural Information Processing Systems (NeurIPS 2021)
Meta-learning approaches to few-shot classification are computationally efficient at test time, requiring just a few optimization steps or a single forward pass to learn a new task, but they remain highly memory-intensive to train. This limitation arises…
External link:
http://arxiv.org/abs/2107.01105
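One way the training memory can be reduced, sketched here under assumptions rather than as the paper's exact method, is to build class prototypes from the whole support set while keeping activations (and hence gradients) for only a random subset of support images.

import torch
import torch.nn as nn

def prototype_with_subset_grad(support, encoder, keep=8):
    idx = torch.randperm(support.shape[0])
    with torch.no_grad():
        frozen = encoder(support[idx[keep:]])   # no activations stored for these images
    live = encoder(support[idx[:keep]])         # backpropagate only through this subset
    return torch.cat([frozen, live], dim=0).mean(dim=0)

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64))   # toy image encoder
support = torch.randn(100, 3, 32, 32)                               # large support set
proto = prototype_with_subset_grad(support, encoder)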
Meta-Learning (ML) has proven to be a useful tool for training Few-Shot Learning (FSL) algorithms by exposure to batches of tasks sampled from a meta-dataset. However, the standard training procedure overlooks the dynamic nature of the real world, where…
External link:
http://arxiv.org/abs/2104.05344
Few-Shot Learning (FSL) algorithms are commonly trained through Meta-Learning (ML), which exposes models to batches of tasks sampled from a meta-dataset to mimic tasks seen during evaluation. However, the standard training procedures overlook the real-world…
External link:
http://arxiv.org/abs/2101.02523
In self-supervised learning, a system is tasked with achieving a surrogate objective by defining alternative targets on a set of unlabeled data. The aim is to build useful representations that can be used in downstream tasks, without costly manual annotation…
External link:
http://arxiv.org/abs/2006.05849
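A minimal example of such a surrogate objective: predict which of four rotations was applied to an unlabelled image, so the targets are generated from the data itself. Rotation prediction is used here only as a common illustrative pretext task, not as the paper's specific method.

import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
rot_head = nn.Linear(128, 4)                         # surrogate target: rotation index
opt = torch.optim.Adam(list(encoder.parameters()) + list(rot_head.parameters()), lr=1e-3)

images = torch.randn(16, 3, 32, 32)                  # unlabelled batch
k = torch.randint(0, 4, (16,))                       # self-generated "labels"
rotated = torch.stack([torch.rot90(img, int(r), dims=(1, 2)) for img, r in zip(images, k)])

loss = nn.functional.cross_entropy(rot_head(encoder(rotated)), k)
loss.backward()
opt.step()                                           # the learned encoder is reused downstream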