Showing 1 - 10 of 18 for search: '"Li, Liam"'
Author:
Shen, Junhong, Li, Liam, Dery, Lucio M., Staten, Corey, Khodak, Mikhail, Neubig, Graham, Talwalkar, Ameet
Fine-tuning large-scale pretrained models has led to tremendous progress in well-studied modalities such as vision and NLP. However, similar gains have not been observed in many other modalities due to a lack of relevant pretrained models. In this wo
External link:
http://arxiv.org/abs/2302.05738
Author:
Khodak, Mikhail, Tu, Renbo, Li, Tian, Li, Liam, Balcan, Maria-Florina, Smith, Virginia, Talwalkar, Ameet
Tuning hyperparameters is a crucial but arduous part of the machine learning pipeline. Hyperparameter optimization is even more challenging in federated learning, where models are learned over a distributed network of heterogeneous devices; here, the
External link:
http://arxiv.org/abs/2106.04502
An important goal of AutoML is to automate-away the design of neural networks on new tasks in under-explored domains. Motivated by this goal, we study the problem of enabling users to discover the right neural operations given data from their specifi
External link:
http://arxiv.org/abs/2103.15798
Meta-learning has enabled learning statistical models that can be quickly adapted to new prediction tasks. Motivated by use-cases in personalized federated learning, we study the often overlooked aspect of the modern meta-learning algorithms -- their
External link:
http://arxiv.org/abs/2102.00127
Recent state-of-the-art methods for neural architecture search (NAS) exploit gradient-based optimization by relaxing the problem into continuous optimization over architectures and shared-weights, a noisy process that remains poorly understood. We ar
External link:
http://arxiv.org/abs/2004.07802
Hyperparameter tuning of multi-stage pipelines introduces a significant computational burden. Motivated by the observation that work can be reused across pipelines if the intermediate computations are the same, we propose a pipeline-aware approach to
External link:
http://arxiv.org/abs/1903.05176
Author:
Li, Liam, Talwalkar, Ameet
Published in:
Conference on Uncertainty in Artificial Intelligence (UAI), 2019
Neural architecture search (NAS) is a promising research direction that has the potential to replace expert-designed networks with learned, task-specific architectures. In this work, in order to help ground the empirical results in this field, we pro
External link:
http://arxiv.org/abs/1902.07638
Author:
Li, Liam, Jamieson, Kevin, Rostamizadeh, Afshin, Gonina, Ekaterina, Hardt, Moritz, Recht, Benjamin, Talwalkar, Ameet
Published in:
Conference on Machine Learning and Systems 2020
Modern learning models are characterized by large hyperparameter spaces and long training times. These properties, coupled with the rise of parallel computing and the growing demand to productionize machine learning workloads, motivate the need to de
External link:
http://arxiv.org/abs/1810.05934
Author:
Li, Liam
Machine learning is widely used in a variety of different disciplines to develop predictive models for variables of interest. However, building such solutions is a time-consuming and challenging discipline that requires highly trained data scientists
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2de0fc81d377ecf048d34aa20a361889
Published in:
ChemSusChem