Unveiling the role of plasticity rules in reservoir computing. (arXiv:2101.05848v1 [nlin.AO])

Reservoir Computing (RC) is an appealing approach in Machine Learning that
combines the high computational capabilities of Recurrent Neural Networks with
a fast and easy training method. Moreover, the successful implementation of
neuro-inspired plasticity rules in RC artificial networks has boosted the
performance of the original models. In this manuscript, we analyze the role
that plasticity rules play on the changes that lead to a better performance of
RC. To this end, we implement synaptic and non-synaptic plasticity rules in a
paradigmatic example of RC model: the Echo State Network. Testing on nonlinear
time series prediction tasks, we show evidence that the improved performance of
all plastic models is linked to a decrease in the pairwise correlations in the
reservoir, as well as a significant increase in individual neurons' ability to
separate similar inputs in their activity space. Here we provide new insights
into this observed improvement through the study of different stages of
plastic learning. From the perspective of the reservoir dynamics, optimal
performance is found to occur close to the so-called edge of instability. Our
results also show that it is possible to combine different forms of plasticity
(namely synaptic and non-synaptic rules) to further improve the performance on
prediction tasks, obtaining better results than those achieved with
single-plasticity models.

Source: https://arxiv.org/abs/2101.05848
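To make the setting concrete, below is a minimal Echo State Network sketch in the spirit of the abstract: a fixed random reservoir driven by an input, with only a linear readout trained (the "fast and easy training" step). The sizes, spectral radius, ridge regularization, and the toy sine series are illustrative assumptions, not the paper's exact configuration, and no plasticity rule is applied here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and spectral radius (assumptions, not the paper's setup).
# Scaling the spectral radius toward 1 pushes the reservoir toward the
# "edge of instability" mentioned in the abstract.
n_in, n_res, spectral_radius = 1, 200, 0.95

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # input weights (fixed)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # recurrent weights (fixed)
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states x(t+1) = tanh(W_in u(t) + W x(t))."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead prediction on a toy nonlinear series (sine as a stand-in
# for the nonlinear time series tasks used in the paper).
u = np.sin(0.3 * np.arange(1000))
X = run_reservoir(u[:-1])
y = u[1:]

# Ridge-regression readout: the only trained part of the model.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

A plasticity rule (synaptic or non-synaptic) would modify `W` or the neurons' activation parameters during a pre-training phase before the readout is fit; the abstract reports that combining both forms yields the best prediction performance.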
