Probabilistic Machine Learning Methods for Neural Time Series Analysis - Invited Speaker: Dr. Scott Linderman

#MachineLearning #ArtificialIntelligence #DynamicalSystems

Modern recording technologies provide unprecedented access to the brain, offering large-scale measurements of neural population activity. These technologies promise unique opportunities for neuroscientists, but they also present extraordinary statistical and computational challenges: how do we formulate models that capture the nonlinear dynamics of neural activity; how do we fit these models at scale; and how do we assess our modeling assumptions in light of data? I will discuss my ongoing efforts to tackle these challenges with switching linear dynamical systems (SLDS), statistical models that achieve nonlinear dynamics by composing simple, linear components. Specifically, I will introduce the "recurrent" SLDS, which enables discrete state transitions to depend on exogenous inputs and internal latent states, and I will describe novel methods of Bayesian learning and inference for this model. Through applications to whole-brain recordings of C. elegans, I will show how the recurrent SLDS can reveal interpretable structure in complex neural data.
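
To make the model class concrete, the sketch below simulates the generative structure of a recurrent SLDS as described in the abstract: each discrete state indexes a set of linear dynamics on a continuous latent state, and the discrete transition probabilities depend on the previous continuous state. This is a minimal illustration only; all dimensions, parameter values, and variable names are assumptions chosen for exposition, not the speaker's implementation or any particular library's API.

```python
import numpy as np

# Toy recurrent SLDS (rSLDS) generative model:
#   z_t : discrete state, chosen from K regimes
#   x_t : continuous latent state with per-regime linear dynamics
#   y_t : observed data, a linear-Gaussian emission of x_t
# "Recurrent" means the transition probabilities for z_t depend on x_{t-1}.

rng = np.random.default_rng(0)
K, D_latent, D_obs, T = 3, 2, 10, 200   # regimes, latent dim, observed dim, time steps

# Per-regime linear dynamics (illustrative values)
A = [np.eye(D_latent) + 0.05 * rng.standard_normal((D_latent, D_latent)) for _ in range(K)]
b = [0.1 * rng.standard_normal(D_latent) for _ in range(K)]

R = rng.standard_normal((K, D_latent))   # recurrent transition weights
r = rng.standard_normal(K)               # transition biases
C = rng.standard_normal((D_obs, D_latent))  # emission matrix

def softmax(v):
    v = v - v.max()
    e = np.exp(v)
    return e / e.sum()

x = np.zeros((T, D_latent))
z = np.zeros(T, dtype=int)
y = np.zeros((T, D_obs))

for t in range(1, T):
    # Discrete transition depends on the previous continuous state x_{t-1}
    p = softmax(R @ x[t - 1] + r)
    z[t] = rng.choice(K, p=p)
    # Continuous state follows the linear dynamics of the active regime
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.01 * rng.standard_normal(D_latent)
    # Linear-Gaussian emission
    y[t] = C @ x[t] + 0.1 * rng.standard_normal(D_obs)
```

In the Bayesian treatment discussed in the talk, the per-regime dynamics, transition weights, and latent states would be inferred from the observed data rather than fixed by hand as they are in this sketch.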

  Date and Time

  • Date: 07 Aug 2017
  • Time: 12:00 PM to 01:00 PM
  • All times are (GMT-05:00) US/Eastern

  Location

  • Griffiss Institute
  • 725 Daedalian Drive
  • Rome, New York
  • United States 13441
  • Room Number: Executive Board Room


  Speakers

Dr. Scott Linderman

Topic:

Probabilistic Machine Learning Methods for Neural Time Series Analysis


Biography:

Dr. Scott Linderman is a Postdoctoral Fellow at Columbia University in the Departments of Computer Science and Statistics and the Grossman Center for the Statistics of Mind. He works at the intersection of machine learning and neuroscience, and he is advised by Professors Liam Paninski and David Blei. He completed his Ph.D. in Computer Science at Harvard University in 2016 under the supervision of Professors Ryan Adams and Leslie Valiant, and he received his B.S. in Electrical and Computer Engineering from Cornell University in 2008. He is a recipient of the Simons Collaboration on the Global Brain Postdoctoral Fellowship, the Siebel Scholarship, and the National Defense Science and Engineering Graduate Fellowship.  He is a finalist for the 2016 Leonard J. Savage Outstanding Dissertation Award from the International Society for Bayesian Analysis, and his work has been recognized with a Best Paper Award at the International Conference on Artificial Intelligence and Statistics.  Prior to graduate school, he worked at Microsoft as a software engineer on the Windows networking stack, and before that he interned at the Air Force Research Laboratory in Rome, NY for five summers.  
