Photo: Gerd Altmann from Pixabay
Modern deep learning is, at the very least, biologically inspired: it encodes information in the strength of connections between large networks of individual computing units known as neurons. Probably the biggest difference, though, is how these neurons communicate with one another.
Artificial neural networks are organized into layers, with each neuron typically connected to every neuron in the next layer. Information passes between layers in a highly synchronized fashion as numbers whose values reflect the strength of the connections between pairs of neurons...
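The layer structure described above can be sketched as a single fully connected layer. The sizes, the tanh nonlinearity, and the random values here are illustrative assumptions, not details from the article:

```python
import numpy as np

def dense_layer(x, W, b):
    """One fully connected layer: every input neuron feeds every
    output neuron. The weight matrix W holds the connection strengths,
    and the nonlinearity squashes each output into a fixed range."""
    return np.tanh(W @ x + b)

# Hypothetical 3-neuron layer feeding a 2-neuron layer.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)          # activations from the previous layer
W = rng.standard_normal((2, 3))     # one weight per neuron pair
b = np.zeros(2)
y = dense_layer(x, W, b)            # values passed on to the next layer
print(y.shape)  # (2,)
```

Stacking such layers, each consuming the previous layer's output in lockstep, is what gives artificial networks their highly synchronized character.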
In a paper in Nature Communications, the Austrian team describes how they created artificial analogues of these two features to build a new learning paradigm they call e-prop. While the approach learns more slowly than backpropagation-based methods, it achieves comparable performance.
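To make the contrast with backpropagation concrete, the core e-prop idea can be sketched as follows: each synapse keeps a purely local "eligibility trace" of recent pre- and postsynaptic activity, and a broadcast learning signal gates that trace into a weight update, with no backward pass through the network. This is a heavily simplified sketch, not the authors' implementation; the network sizes, constants, and toy task are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_rec = 4, 3
x = rng.standard_normal(n_in)        # fixed toy input pattern (assumption)
W = rng.standard_normal((n_rec, n_in)) * 0.1
trace = np.zeros_like(W)             # one eligibility trace per synapse
decay, lr, target = 0.9, 0.01, 1.0
errors = []

for t in range(300):
    h = np.tanh(W @ x)               # unit activity
    y = h.sum()                      # toy scalar readout
    error = y - target               # broadcast learning signal
    # Local trace: a decaying memory of presynaptic activity scaled by
    # postsynaptic sensitivity (1 - h**2 is tanh's derivative).
    trace = decay * trace + np.outer(1 - h**2, x)
    W -= lr * error * trace          # e-prop-style local update
    errors.append(abs(error))

print(errors[-1] < errors[0])        # error shrinks over training
```

Because each update uses only the trace stored at the synapse plus a broadcast error signal, learning proceeds online without backpropagating through time, which is what makes the scheme plausible for neuromorphic hardware even though it converges more slowly.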
Source: Singularity Hub