Surya Ganguli
Assistant Professor of Applied Physics and, By Courtesy, of Neurobiology and of Electrical Engineering
Research areas:
Biophysics, Condensed Matter, Electrical Engineering, Information Sci/Tech, Statistical Physics
Description
Biophysics
Our lab works on theoretical neuroscience, with the fundamental goal of understanding how networks of neurons and synapses cooperate across multiple scales of space and time to mediate important brain functions, like sensory perception, motor control, and memory. To achieve this goal, we employ and extend tools from disciplines like statistical mechanics, dynamical systems theory, machine learning, information theory, control theory, and high-dimensional statistics, and we collaborate with experimental neuroscience laboratories collecting physiological data from a range of model organisms. Some topics of interest include: how birds learn to sing, spatial memory in the rodent hippocampus, attention and motor control in macaques, memory properties of complex synapses, dynamics of plasticity in recurrent networks, signal propagation in neural circuits, the emergence of categorization in multi-layered networks, and the statistical mechanics of high-dimensional data analysis.
Condensed Matter Physics
We employ techniques from statistical mechanics, such as replica theory and random matrix theory, to analyze the complex dynamics of learning, signal propagation, and memory in neuronal networks. We are also interested in using statistical mechanics to analyze the performance of machine learning algorithms that could be implemented in neuronal architectures.
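As a small illustrative sketch (not code from the lab) of the random-matrix ideas mentioned above: the Marchenko–Pastur law predicts that the eigenvalues of a sample covariance matrix built from pure white noise concentrate in a bulk whose edges depend only on the aspect ratio q = p/n, which is one way random matrix theory sets a null baseline for high-dimensional data analysis.

```python
import numpy as np

def marchenko_pastur_support(q):
    """Edges of the Marchenko-Pastur eigenvalue bulk for aspect ratio q = p/n."""
    return (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

rng = np.random.default_rng(0)
n, p = 4000, 1000                  # n samples of p-dimensional white noise
X = rng.standard_normal((n, p))
C = X.T @ X / n                    # sample covariance; true covariance is the identity
eigs = np.linalg.eigvalsh(C)       # eigenvalue spectrum of the symmetric matrix C

# Random matrix theory predicts the spectrum fills [lo, hi] and nothing else,
# up to finite-size fluctuations at the edges.
lo, hi = marchenko_pastur_support(p / n)
frac_inside = np.mean((eigs >= lo - 0.05) & (eigs <= hi + 0.05))
```

Eigenvalues escaping this noise bulk are candidate signal directions, which is the basic logic behind random-matrix-based denoising of neural population recordings.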
Selected Publications
- On simplicity and complexity in the brave new world of large-scale neuroscience
- Environmental boundaries as an error correction mechanism for grid cells
- Deep unsupervised learning using non-equilibrium thermodynamics
- Evidence for a causal inverse model in an avian cortico-basal ganglia circuit
- A memory frontier for complex synapses
- Learning hierarchical category structure in deep neural networks
- A Hebbian learning rule gives rise to mirror neurons and links them to control theoretic inverse models
- Statistical mechanics of complex neural systems and high dimensional data
- Compressed sensing, sparsity and dimensionality in neuronal information processing and data analysis
- Spatial information outflow from the hippocampal circuit: distributed spatial coding and phase precession in the subiculum
- Holographic Protection of Chronology in Universes of the Gödel Type
- Twisted Six Dimensional Gauge Theories, Matrix Models and Integrable Systems
- E10 Orbifolds
- Function Constrains Network Architecture and Dynamics: A Case Study on the Yeast Cell Cycle Network
- One Dimensional Dynamics of Attention and Decision Making in LIP
- Memory Traces in Dynamical Systems
- Feedforward to the past: the relation between neuronal connectivity, amplification, and short-term memory
- Statistical Mechanics of Compressed Sensing
- Short-term memory in neuronal networks through dynamical compressed sensing