Seminar: "Modeling & Analyzing Neural Dynamics and Information Processing Over Multiple Time Scales"

Aug 17
10 a.m.
Green Hall, Room 0120

Sensen Liu, PhD candidate, will present this research in partial fulfillment of the requirements for the Doctor of Philosophy degree.

Abstract: Understanding the mechanisms of multiple-time-scale phenomena in neural circuits is of fundamental interest, since such dynamics are often associated with states of altered cognitive function. One such example is burst suppression, a pattern of the electroencephalogram characterized by quasi-periodic alternation between high-voltage activity (bursts) and isoelectric silence (suppression). Here, following a fast-slow systems framework, we develop a low-dimensional mean-field model for burst suppression that attributes the phenomenology to homeostatic interactions between neuronal activity and the supportive processes of cerebral metabolism. We demonstrate the capacity of the model to produce burst-like activity through several physiologically salient mechanisms and interpret these mechanisms in the context of dynamical systems analysis. We further extend the analysis to study different mechanisms of burst synchronization over local and network-wide spatial scales.
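
To make the fast-slow picture concrete, the sketch below couples a FitzHugh-Nagumo oscillator (standing in for fast population activity) to a slow "metabolic resource" that is depleted by activity and replenished during quiescence; a hysteretic switch on the resource alternates the system between bursting and suppression. This is an illustrative toy under assumed dynamics and arbitrary parameter values, not the model presented in the talk.

```python
# Illustrative fast-slow sketch of burst suppression (not the speaker's model):
# a FitzHugh-Nagumo oscillator plays the role of fast population activity,
# and a slow resource m is depleted by activity and recovers at a fixed rate.
# A hysteretic switch on m gates the drive to the fast subsystem on and off,
# alternating bursts with suppression. All parameters are arbitrary choices.
import numpy as np

def simulate(T=3000.0, dt=0.05):
    n = int(T / dt)
    v, w, m, gate = -1.2, -0.6, 1.0, 1.0   # fast voltage, recovery, slow resource, switch
    trace = np.empty((n, 2))
    for i in range(n):
        # Hysteretic gating: drive shuts off when the resource is depleted
        # and turns back on only after substantial recovery.
        if gate and m < 0.3:
            gate = 0.0
        elif not gate and m > 0.7:
            gate = 1.0
        dv = v - v**3 / 3.0 - w + 0.8 * gate      # fast subsystem; oscillates when driven
        dw = 0.08 * (v + 0.7 - 0.8 * w)
        activity = max(v + 1.0, 0.0)              # rectified proxy for population firing
        dm = 0.002 * (1.0 - m - 2.0 * activity)   # slow depletion/recovery of the resource
        v, w, m = v + dt * dv, w + dt * dw, m + dt * dm
        trace[i] = (activity, m)
    return trace

trace = simulate()   # columns: activity (burst/suppression pattern), slow resource
```

The key design point is the separation of time scales: the resource rate constant (0.002) is orders of magnitude smaller than the fast dynamics, so the fast subsystem sees m as a slowly drifting parameter, which is what yields the quasi-periodic alternation.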

Following our modeling efforts, we undertake a theoretical study of how multiple-time-scale dynamics may enable certain forms of information processing in neural circuits. In particular, we examine the dynamics of network plasticity, i.e., the adaptation of neuronal synapses (connections between neurons) as a function of ongoing network activity. In such networks, neural responses carry rich information about past events. We consider the question of how to endow recurrent networks with learning rules that allow them to memorize and mimic spatiotemporal patterns over long time scales. We approach this problem from first principles, synthesizing a synaptic adaptation rule that maximizes information retention within spiking networks. We show that multiple-time-scale learning dynamics provide a possible mechanism by which this optimization can be performed in a local (distributed) fashion, and we illustrate the functionality of the resulting network through a number of canonical example tasks.
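
As a rough illustration of what a local, two-time-scale learning rule can look like (not the rule derived in the dissertation), the sketch below updates each recurrent weight using only quantities available at that synapse: a fast Hebbian term driven by presynaptic and postsynaptic rates, plus a much slower homeostatic scaling that holds each unit's average activity near a target. The tanh rate model, the sinusoidal drive, and all constants are assumptions for illustration.

```python
# Sketch of a two-time-scale, local plasticity rule in a small rate network.
# Each weight W[i, j] is updated from locally available signals only:
# postsynaptic rate r[i] and presynaptic rate r[j].
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 5000
eta_fast, eta_slow, r_target = 1e-3, 1e-5, 0.1

W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights
r = np.zeros(N)                                  # firing rates
r_bar = np.zeros(N)                              # slow running average of rates

for t in range(T):
    u = np.sin(0.02 * t + np.linspace(0, np.pi, N))   # spatiotemporal drive
    r = np.tanh(W @ r + u)                            # rate dynamics (one Euler step)
    r_bar += 0.001 * (r - r_bar)                      # slow activity average

    # Fast, local Hebbian update: dW[i, j] proportional to post_i * pre_j.
    W += eta_fast * np.outer(r, r)
    # Slow, local homeostatic scaling: row i (unit i's incoming weights)
    # shrinks or grows as its average activity over- or undershoots the target.
    W *= (1.0 + eta_slow * (r_target - np.abs(r_bar)))[:, None]
```

The two learning rates (eta_fast versus eta_slow) mirror the abstract's theme: rapid correlation-driven adaptation stores structure from ongoing activity, while a slower process keeps the network stable enough to retain it.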

Dissertation advisor: ShiNung Ching

Organizer: Shauna Dollison, sdollison@wustl.edu