Showing 1 - 10 of 205 for search: '"garg, Isha"'
In this work, we study how well the learned weights of a neural network utilize the space available to them. This notion is related to capacity, but additionally incorporates the interaction of the network architecture with the dataset. Most learned …
External link:
http://arxiv.org/abs/2407.04797
Deep learning has proved successful in many applications but suffers from high computational demands and requires custom accelerators for deployment. Crossbar-based analog in-memory architectures are attractive for acceleration of deep neural network …
External link:
http://arxiv.org/abs/2403.13082
Deep neural networks are over-parameterized and easily overfit the datasets they train on. In the extreme case, it has been shown that these networks can memorize a training set with fully randomized labels. We propose using the curvature of loss function …
External link:
http://arxiv.org/abs/2307.05831
TOFU: Towards Obfuscated Federated Updates by Encoding Weight Updates into Gradients from Proxy Data
Advances in Federated Learning and an abundance of user data have enabled rich collaborative learning between multiple clients, without sharing user data. This is done via a central server that aggregates learning in the form of weight updates. However, …
External link:
http://arxiv.org/abs/2201.08494
Over the past decade, deep neural networks have proven to be adept in image classification tasks, often surpassing humans in terms of accuracy. However, standard neural networks often fail to understand the concept of hierarchical structures and dependencies …
External link:
http://arxiv.org/abs/2112.10844
Spiking Neural Networks (SNNs) are a promising alternative to traditional deep learning methods since they perform event-driven information processing. However, a major drawback of SNNs is high inference latency. The efficiency of SNNs could be enhanced …
External link:
http://arxiv.org/abs/2104.12528
Published in:
International Conference on Learning Representations (ICLR), 2021
The ability to learn continually without forgetting past tasks is a desired attribute for artificial learning systems. Existing approaches to enable such learning in artificial neural networks usually rely on network growth, importance based weight …
External link:
http://arxiv.org/abs/2103.09762
Deep neural networks have found widespread adoption in solving complex tasks ranging from image recognition to natural language processing. However, these networks make confident mispredictions when presented with data that does not belong to the training …
External link:
http://arxiv.org/abs/2012.08398
Spiking Neural Networks (SNNs) offer a promising alternative to traditional deep learning frameworks, since they provide higher computational efficiency due to event-driven information processing. SNNs distribute the analog values of pixel intensities …
External link:
http://arxiv.org/abs/2010.01795
Deep Learning models hold state-of-the-art performance in many fields, but their vulnerability to adversarial examples poses a threat to their ubiquitous deployment in practical settings. Additionally, adversarial inputs generated on one classifier have …
External link:
http://arxiv.org/abs/2008.01524