Showing 1 - 10 of 19 for search: '"Parthasarathy, Nikhil"'
Data curation is an essential component of large-scale pretraining. In this work, we demonstrate that jointly selecting batches of data is more effective for learning than selecting examples independently. Multimodal contrastive objectives expose the …
External link:
http://arxiv.org/abs/2406.17711
Author:
Kuoch, Michael, Chou, Chi-Ning, Parthasarathy, Nikhil, Dapello, Joel, DiCarlo, James J., Sompolinsky, Haim, Chung, SueYeon
Recently, growth in our understanding of the computations performed in both biological and artificial neural networks has largely been driven by either low-level mechanistic studies or global normative approaches. However, concrete methodologies for …
External link:
http://arxiv.org/abs/2312.14285
Published in:
Transactions on Machine Learning Research, Jun 2024
Human ability to recognize complex visual patterns arises through transformations performed by successive areas in the ventral visual cortex. Deep neural networks trained end-to-end for object recognition approach human capabilities, and offer the best …
External link:
http://arxiv.org/abs/2312.11436
Author:
Balažević, Ivana, Steiner, David, Parthasarathy, Nikhil, Arandjelović, Relja, Hénaff, Olivier J.
In-context learning – the ability to configure a model's behavior with different prompts – has revolutionized the field of natural language processing, alleviating the need for task-specific models and paving the way for g…
External link:
http://arxiv.org/abs/2306.01667
Humans learn powerful representations of objects and scenes by observing how they evolve over time. Yet, outside of specific tasks that require explicit temporal understanding, static image pretraining remains the dominant paradigm for learning visual …
External link:
http://arxiv.org/abs/2210.06433
Author:
Koppula, Skanda, Li, Yazhe, Shelhamer, Evan, Jaegle, Andrew, Parthasarathy, Nikhil, Arandjelovic, Relja, Carreira, João, Hénaff, Olivier
Self-supervised methods have achieved remarkable success in transfer learning, often achieving the same or better accuracy than supervised pre-training. Most prior work has done so by increasing pre-training computation by adding complex data augment…
External link:
http://arxiv.org/abs/2209.15589
In this thesis, a novel approach is presented for blade loss simulation of an aircraft gas turbine rotor mounted on rolling element bearings with squeeze film dampers, seal rub and enclosed in a flexible housing. The modal truncation augmentation (MTA) …
External link:
http://hdl.handle.net/1969.1/189
We develop a model for representing visual texture in a low-dimensional feature space, along with a novel self-supervised learning objective that is used to train it on an unlabeled database of texture images. Inspired by the architecture of primate …
External link:
http://arxiv.org/abs/2006.16976
Author:
Feinman, Reuben, Parthasarathy, Nikhil
Normalizing Flows are a promising new class of algorithms for unsupervised learning based on maximum likelihood optimization with change of variables. They offer to learn a factorized component representation for complex nonlinear data and, simultaneously …
External link:
http://arxiv.org/abs/1907.06496
Thesis (M.S.)--Texas A & M University, 2003.
"Major Subject: Mechanical Engineering." Title from author-supplied metadata (automated record created on Apr. 30, 2004). Vita. Abstract. Includes bibliographical references.
External link:
http://hdl.handle.net/1969/189