Description: |
Resistor networks have recently attracted interest as analog computing platforms for machine learning, particularly due to their compatibility with the Equilibrium Propagation training framework. In this work, we explore the computational capabilities of these networks. We prove that electrical networks consisting of voltage sources, linear resistors, diodes, and voltage-controlled voltage sources (VCVS) can approximate any continuous function to arbitrary precision. Central to our proof is a method for translating a ReLU neural network into an approximately equivalent electrical network comprising these four elements. Our proof relies on two assumptions: (a) circuit elements are ideal, and (b) variable resistor conductances and VCVS amplification factors can take any value (arbitrarily small or large). Our findings provide insights that could guide the development of universal self-learning electrical networks.