Nov 16, 2016
Green Hall, Room 0120
MohammadMehdi Kafashan will present "Understanding the role of dynamics in brain networks: methods, theory and application."
Abstract: The brain is inherently a dynamical system whose networks interact at multiple spatial and temporal scales. Understanding the functional role of these dynamic interactions is a fundamental question in neuroscience. In this research, we approach this question by developing new methods for characterizing brain dynamics from real data and new theories for linking dynamics to function. We carry out our study at two scales: macro (the level of brain regions) and micro (the level of individual neurons).

In the first part, we develop methods to identify the underlying dynamics at the macro scale that govern brain networks during states of health and disease in humans. First, we establish an optimization framework for actively probing connections in brain networks whose underlying dynamics change over time. We then extend this framework into a data-driven approach for analyzing neurophysiological recordings without active stimulation, describing the spatiotemporal structure of neural activity at different timescales. We demonstrate the efficacy of this approach in characterizing spatiotemporal motifs of correlated neural activity during the transition from wakefulness to general anesthesia in functional magnetic resonance imaging (fMRI) data. Moreover, we show how such an approach can be used to build an automatic classifier for detecting different levels of coma from electroencephalogram (EEG) data.

In the second part, we study how ongoing function constrains dynamics at the micro scale in recurrent neural networks, with application to sensory systems. Specifically, we derive theoretical conditions in a linear recurrent network, in the presence of both disturbance and noise, for exact and stable recovery of dynamic sparse stimuli applied to the network, and we show how network dynamics affect decoding performance in such systems.
Moreover, we formulate the problem of efficiently encoding an afferent input and its history in a nonlinear recurrent network. We show that a linear neural network architecture with a thresholding activation function emerges when neurons optimize their activity with respect to a particular cost function, enabling lightweight, history-sensitive encoding schemes.
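The idea that thresholding dynamics emerge when units minimize a cost function is closely related to classical sparse-coding schemes such as ISTA and locally competitive algorithms. The following is a minimal generic sketch of that family of dynamics, not the speaker's actual model: the dictionary `Phi`, the sparsity weight `lam`, and the step size `eta` are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: the nonlinearity that arises when
    # units minimize a quadratic reconstruction cost plus an L1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_recurrent_encode(Phi, u, lam=0.05, steps=500, eta=0.1):
    """Find a sparse code `a` minimizing
        0.5 * ||u - Phi @ a||^2 + lam * ||a||_1
    via ISTA-style thresholded recurrent dynamics:
    feedforward drive minus recurrent feedback, then threshold."""
    a = np.zeros(Phi.shape[1])
    for _ in range(steps):
        a = soft_threshold(a + eta * Phi.T @ (u - Phi @ a), eta * lam)
    return a

# Toy example: recover a 3-sparse code from a 20-dimensional input.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 50))
Phi /= np.linalg.norm(Phi, axis=0)        # unit-norm dictionary columns
a_true = np.zeros(50)
a_true[[3, 17, 40]] = [1.5, -2.0, 1.0]    # sparse "stimulus"
u = Phi @ a_true
a_hat = sparse_recurrent_encode(Phi, u)
```

Because the threshold sets small activities exactly to zero, the resulting code `a_hat` is sparse while still reconstructing the input, which is one sense in which such an encoding can be lightweight.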
Organizer / Host: Shauna Dollison, (314) 935-4830