Reservoir Computing
March 28, 2022
I am familiar with reservoir computing only from random NNs, but there is a whole weird literature on training reservoir models that lives in a mirror world to classic gradient-descent-trained predictive models. It seems to lean more heavily on the statistical mechanics of statistics than I am accustomed to.
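The core trick, as I understand it from the echo-state-network end of that literature, is that the recurrent weights are drawn at random and then frozen; only a linear readout is fitted, typically by ridge regression rather than by gradient descent. A toy sketch (my own illustration, not code from either paper below; the reservoir size, spectral radius, leak rate and ridge penalty are arbitrary choices):

```python
# Minimal echo-state-network sketch: the recurrent "reservoir" is random and fixed;
# only the linear readout is fit, by ridge regression rather than gradient descent.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in, leak, ridge = 300, 1, 0.3, 1e-6

# Fixed random reservoir, rescaled to a chosen spectral radius.
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with input sequence u of shape (T, n_in); return states (T, n_res)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u_t)
        states[t] = x
    return states

# Toy task: one-step-ahead prediction of a noisy sine.
T = 2000
u = np.sin(0.2 * np.arange(T))[:, None] + 0.1 * rng.normal(size=(T, 1))
X = run_reservoir(u[:-1])
Y = u[1:]

washout = 100                      # discard the initial transient
Xw, Yw = X[washout:], Y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(n_res), Xw.T @ Yw)  # ridge regression

pred = X @ W_out
print("train MSE:", np.mean((pred[washout:] - Yw) ** 2))
```

All the model capacity sits in the fixed random matrix; the "training" is a single linear solve, which is what gives the whole thing its mirror-world feel.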
Gauthier et al. (2021) introduce modern training methods for these things; for a popular account see Scientists develop the next generation of reservoir computing.
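If I read the paper right, the "next generation" variant drops the random recurrent network entirely and uses an explicit feature map instead: a few time-delayed copies of the input plus their low-order polynomial products, with the usual ridge-regression readout on top. A rough sketch under that reading (the delay depth, polynomial degree and penalty here are my own guesses, not the paper's settings):

```python
# Hedged sketch of the "next generation reservoir computing" idea as I understand it:
# replace the random reservoir with time-delay coordinates plus their quadratic
# monomials, then fit a linear readout by ridge regression.
import numpy as np
from itertools import combinations_with_replacement

def ngrc_features(u, k=2):
    """Stack k time-delayed copies of u, then append all unique quadratic monomials."""
    T, d = u.shape
    lin = np.hstack([u[k - 1 - j : T - j] for j in range(k)])   # (T-k+1, k*d)
    quad = np.stack([lin[:, i] * lin[:, j]
                     for i, j in combinations_with_replacement(range(lin.shape[1]), 2)],
                    axis=1)
    return np.hstack([np.ones((len(lin), 1)), lin, quad])       # constant + linear + quadratic

rng = np.random.default_rng(1)
T = 1000
u = np.sin(0.3 * np.arange(T))[:, None] + 0.05 * rng.normal(size=(T, 1))

k = 2
feats = ngrc_features(u, k)[:-1]        # features known at each time step
target = u[k:]                          # next value to predict
ridge = 1e-8
W_out = np.linalg.solve(feats.T @ feats + ridge * np.eye(feats.shape[1]), feats.T @ target)
print("train MSE:", np.mean((feats @ W_out - target) ** 2))
```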
Kim and Bassett (2022) show how to program them:
we turn the analogy between neural computation and silicon computation into a concrete reality by programming fundamental constructs from computer science into reservoir computers. First, we extend the idea of static memory in silicon computers to program chaotic dynamical systems as random access memories (dRAM). Second, because RCs can store dynamical systems as memories, and the RC itself is a dynamical system, we demonstrate that a host RC can virtualize the time-evolution of a guest RC, precisely as a host silicon computer can create a virtual machine of a guest computer. Third, we provide a concrete implementation of a fully neural logical calculus by programming RCs to evolve as the logic gates and, nand, or, nor, xor, and xnor, and construct neural implementations of logic circuits such as a binary adder, flip-flop latch, and multivibrator circuit. Finally, we define a simple scheme for software and game development on RC architectures by programming an RC to simulate a variant of the game “pong.” Through this language, we define a concrete, practical, and fully generalizable implementation of neural computation.
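Their programming constructions set the weights analytically rather than fitting them, so the following is not their method; it is just a toy trained-readout illustration of the flavour of the claim, that a reservoir driven by two binary streams can be read out as a logic gate such as xor:

```python
# Not Kim and Bassett's construction (they program the weights directly), but a toy
# illustration of the end result: a reservoir driven by two bit streams whose trained
# linear readout tracks the xor of the current input pair.
import numpy as np

rng = np.random.default_rng(2)
n_res, leak, ridge = 200, 0.5, 1e-6
W = rng.normal(size=(n_res, n_res))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, size=(n_res, 2))

# Two random bit streams; each bit is held for `hold` steps so the reservoir settles.
hold, n_bits = 20, 400
bits = rng.integers(0, 2, size=(n_bits, 2))
u = np.repeat(bits, hold, axis=0).astype(float)
target = np.repeat(bits[:, 0] ^ bits[:, 1], hold).astype(float)[:, None]

x = np.zeros(n_res)
states = np.empty((len(u), n_res))
for t, u_t in enumerate(u):
    x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u_t)
    states[t] = x

washout = 5 * hold
X, Y = states[washout:], target[washout:]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)  # ridge-regression readout
acc = np.mean(((X @ W_out) > 0.5) == (Y > 0.5))
print("xor readout accuracy:", acc)
```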