Increasing biases can be more efficient than increasing weights

Authors: Metta, Carlo, Fantozzi, Marco, Papini, Andrea, Amato, Gianluca, Bergamaschi, Matteo, Galfrè, Silvia Giulia, Marchetti, Alessandro, Vegliò, Michelangelo, Parton, Maurizio, Morandin, Francesco
Year: 2023
Document type: Working Paper
Description: We introduce a novel computational unit for neural networks that features multiple biases, challenging the traditional perceptron structure. This unit emphasizes the importance of preserving uncorrupted information as it is passed from one unit to the next, applying activation functions later in the process with a specialized bias for each receiving unit. Through both empirical and theoretical analyses, we show that by focusing on increasing biases rather than weights, there is potential for significant enhancement in a neural network model's performance. This approach offers an alternative perspective on optimizing information flow within neural networks. See source code at https://github.com/CuriosAI/dac-dev.
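One way to read the description above is that a unit's pre-activation is passed on uncorrupted, and each receiving unit applies the activation with its own bias before weighting. The sketch below is a minimal NumPy interpretation of that idea; the function name `multi_bias_layer` and the exact formulation are our own illustrative assumptions, not the authors' implementation (which is available at the linked repository).

```python
import numpy as np

def multi_bias_layer(x, W, B, act=np.tanh):
    """Illustrative multi-bias unit (an assumption based on the abstract):
    each receiving unit j shifts the incoming signal x_i by its own bias
    B[j, i], applies the activation, then combines with weight W[j, i].

    Shapes: x is (n_in,), W and B are (n_out, n_in); returns (n_out,).
    Contrast with a standard perceptron layer, act(W @ x + b), where the
    activation is applied once with a single bias per unit.
    """
    return (W * act(x[None, :] + B)).sum(axis=1)

# Toy example: 3 inputs feeding 2 output units.
x = np.array([0.5, -0.2, 0.1])
W = np.ones((2, 3))          # uniform weights for illustration
B = np.zeros((2, 3))         # one bias per (receiver, sender) pair
y = multi_bias_layer(x, W, B)
```

With all biases at zero and unit weights, each output reduces to the sum of `tanh(x_i)`, matching the conventional case; training the per-connection biases is what distinguishes the unit.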
Comment: Major rewrite. Supersedes v1 and v2. Focuses on the fact that not all parameters are born equal: biases can be more important than weights. Accordingly, new title and new abstract, and many more experiments on fully connected architectures. This is the extended version of the paper published at WACV 2024.
Database: arXiv