Showing 1 - 10 of 121 for the query: '"SAVIN, Cristina"'
This perspective piece is the result of a Generative Adversarial Collaboration (GAC) tackling the question 'How does neural activity represent probability distributions?'. We have addressed three major obstacles to progress on answering this question…
External link:
http://arxiv.org/abs/2409.02709
Authors:
Bredenberg, Colin, Savin, Cristina
Normative models of synaptic plasticity use a combination of mathematics and computational simulations to arrive at predictions of behavioral and network-level adaptive phenomena. In recent years, there has been an explosion of theoretical work on th…
External link:
http://arxiv.org/abs/2308.04988
Authors:
Zador, Anthony, Escola, Sean, Richards, Blake, Ölveczky, Bence, Bengio, Yoshua, Boahen, Kwabena, Botvinick, Matthew, Chklovskii, Dmitri, Churchland, Anne, Clopath, Claudia, DiCarlo, James, Ganguli, Surya, Hawkins, Jeff, Koerding, Konrad, Koulakov, Alexei, LeCun, Yann, Lillicrap, Timothy, Marblestone, Adam, Olshausen, Bruno, Pouget, Alexandre, Savin, Cristina, Sejnowski, Terrence, Simoncelli, Eero, Solla, Sara, Sussillo, David, Tolias, Andreas S., Tsao, Doris
Neuroscience has long been an essential driver of progress in artificial intelligence (AI). We propose that to accelerate progress in AI, we must invest in fundamental research in NeuroAI. A core component of this is the embodied Turing test, which c…
External link:
http://arxiv.org/abs/2210.08340
Latent manifolds provide a compact characterization of neural population activity and of shared co-variability across brain areas. Nonetheless, existing statistical tools for extracting neural manifolds face limitations in terms of interpretability o…
External link:
http://arxiv.org/abs/2209.02816
Authors:
Prince, Luke Y., Eyono, Roy Henha, Boven, Ellen, Ghosh, Arna, Pemberton, Joe, Scherr, Franz, Clopath, Claudia, Costa, Rui Ponte, Maass, Wolfgang, Richards, Blake A., Savin, Cristina, Wilmes, Katharina Anna
We provide a brief review comparing common assumptions about biological learning with findings from experimental neuroscience, and contrast them with the efficiency of gradient-based learning in recurrent neural networks. The key issues discussed in this…
External link:
http://arxiv.org/abs/2105.05382
Conventional hyperparameter optimization methods are computationally intensive and hard to generalize to scenarios that require dynamically adapting hyperparameters, such as life-long learning. Here, we propose an online hyperparameter optimization a…
External link:
http://arxiv.org/abs/2102.07813
We present a framework for compactly summarizing many recent results in efficient and/or biologically plausible online training of recurrent neural networks (RNNs). The framework organizes algorithms according to several criteria: (a) past vs. future…
External link:
http://arxiv.org/abs/1907.02649
To learn useful dynamics on long time scales, neurons must use plasticity rules that account for long-term, circuit-wide effects of synaptic changes. In other words, neural circuits must solve a credit assignment problem to appropriately assign respo…
External link:
http://arxiv.org/abs/1905.12100
Academic article
This result cannot be displayed to unauthenticated users; sign in to view it.
Academic article
This result cannot be displayed to unauthenticated users; sign in to view it.