The Neural Information Dynamics of Predictive Coding from Synapses to Systems

25.01.2017, 17:00 - 19:00
Lecturer

Prof. Dr. Michael Wibral

Brain Imaging Center, Goethe University, Frankfurt

Information-theoretic quantities separate and measure key elements of distributed computation in neural systems, such as the storage, transfer, and modification of information. In this way, they help us to better understand the computational algorithms implemented in a neural system. This understanding cannot be reached by detailed biophysical modeling alone, as David Marr already pointed out in his classic tri-level hypothesis. In other words, we may understand biophysically well why brain signals look the way they do, yet not what these signals are worth in terms of information processing proper. This shortcoming is particularly severe when trying to understand whether and how the brain performs predictive coding, and it forces us to rely on very strong prior assumptions about how the brain should work (the very thing we are trying to find out).

Indeed, the missing link between neural dynamics at the biophysical level and the computational algorithms implemented by these dynamics can be provided by information-theoretic methods, especially the analysis of (local) transfer entropy and (local) active information storage. I will demonstrate how to use these measures to distinguish between neurons coding for prediction errors and neurons coding for predictable inputs. This will be done based on the neural dynamics alone, without invoking semantics imposed by the experimenter - in other words, without imposing a priori what the brain should predict.
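
For reference, a minimal sketch of the two measures named above, in their standard form (transfer entropy following Schreiber, the local variants following Lizier and colleagues); the past-state embedding lengths $k$ and $l$ are analysis parameters assumed here, not specifics of the talk:

Active information storage of a process $X$ with past state $X_t^{(k)} = (X_t, \dots, X_{t-k+1})$:

$$A_X = I\big(X_t^{(k)} ; X_{t+1}\big), \qquad a_X(t+1) = \log \frac{p\big(x_{t+1} \mid x_t^{(k)}\big)}{p(x_{t+1})}$$

Transfer entropy from a source process $Y$ into $X$:

$$T_{Y \to X} = I\big(Y_t^{(l)} ; X_{t+1} \mid X_t^{(k)}\big), \qquad t_{Y \to X}(t+1) = \log \frac{p\big(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\big)}{p\big(x_{t+1} \mid x_t^{(k)}\big)}$$

Roughly speaking, the local variants assign a value to each time point, so that moments where a neuron's activity is well predicted by its own past (high local storage) can be told apart from moments where it is driven by unpredicted input (high local transfer).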