Statistical Modeling and Analysis of Neural Data

NEU 560 (Spring 2018), Princeton University
time: Tu/Th 11:00-12:20
location: PNI A30
instructor: Jonathan Pillow (Office hours: Tues 12:30-1:30p, PNI 254)
AI: Mike Morais (Office hours: Fri 1:30-3:30, PNI A59)
prerequisites: A good working knowledge of calculus, linear algebra, and basic probability and statistics. Familiarity with Python is also desirable, as homework assignments and final projects will involve programming. No prior experience with neural data is required.
brief description: This course introduces students to methods for modeling and analyzing neural datasets, with an emphasis on statistical approaches to the problem of information processing in neural populations. A tentative list of topics includes: neural encoding models, Poisson processes, generalized linear models, logistic regression, Gaussian processes, latent variable models, factor analysis, mixture models, expectation-maximization (EM), the Kalman filter, and latent dynamical models of neural activity (a short illustrative code sketch of one such model appears just below the course links). The course is aimed at students from quantitative backgrounds (neuroscience, engineering, math, physics, computer science, statistics, psychology) who are interested in the modeling and analysis of neural data.
syllabus: pdf
Piazza site: link
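To give a concrete flavor of the encoding-model material, here is a minimal, purely illustrative Python sketch of fitting a Poisson generalized linear model (GLM) to simulated spike counts by maximum likelihood. The simulated stimulus, variable names, and use of scipy.optimize are assumptions made for this sketch; it is not course material or homework code.

    import numpy as np
    from scipy.optimize import minimize

    # simulate a stimulus and spike counts from a Poisson GLM
    # (hypothetical data, for illustration only)
    rng = np.random.default_rng(0)
    T, D = 500, 5                        # number of time bins, stimulus dimensions
    X = rng.standard_normal((T, D))      # stimulus design matrix
    w_true = 0.5 * rng.standard_normal(D)
    y = rng.poisson(np.exp(X @ w_true))  # spike counts, exponential nonlinearity

    # Poisson negative log-likelihood (dropping the constant log y! term)
    def neg_log_lik(w):
        return np.sum(np.exp(X @ w) - y * (X @ w))

    # maximum-likelihood estimate of the filter weights
    w_ml = minimize(neg_log_lik, np.zeros(D), method="BFGS").x
    print("true weights:", np.round(w_true, 2))
    print("ML estimate: ", np.round(w_ml, 2))

With enough data the ML estimate recovers the simulated weights; later lectures in the schedule (MAP inference & regularization) extend this same fit with priors.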
Lecture Schedule

Date | Topic | Reading | Slides / Notes | HW
Tu 2.06 | Course intro | spikes_intro.pdf | slides01.pdf | hw0.zip
Th 2.08 | Linear algebra review, SVD | useful handouts: E. Simoncelli handout (pdf), M. Jordan chapter (pdf) | slides02.pdf, notes02.pdf |
Tu 2.13 | SVD applications, least squares regression | | slides03a.pdf, notes03b.pdf |
Th 2.15 | Low-rank matrices, determinant, PCA | | slides04.pdf |
Tu 2.20 | More PCA | | slides05.pdf | hw1.zip (due Weds 3/7)
Th 2.22 | Probability review | Kaufman et al., 2014 | slides_Kaufman14.pdf, slides06.pdf |
Tu 2.27 | Neural encoding models & maximum likelihood | | slides07.pdf |
Th 3.01 | Information theory (Mike Morais) | | notes08_infotheory.pdf, slides08_efficientcoding.pdf |
Tu 3.06 | no class | | |
Th 3.08 | Generalized linear models | | slides09.pdf |
Tu 3.13 | GLMs for multi-neuron data | | slides10.pdf | hw2.zip (due Weds 3/28)
Th 3.15 | MAP inference & regularization | | notes11.pdf |
Tu 3.20 | spring break | | |
Th 3.22 | spring break | | |
Tu 3.27 | Gaussian processes (Stephen Keeley) | | notes12.pdf |
Th 3.29 | Primal and dual space views of regression | | notes13.pdf |
Tu 4.03 | Gaussian processes (function space view) | example code: testGPs1.m, testGPs2.m, testGPs.ipynb | notes14.pdf |
Th 4.05 | Cross-validation, evidence optimization, & Laplace approximation | | notes15.pdf | hw3.zip (due Mon 4/16)
Tu 4.10 | Latent variable models and EM | | notes16.pdf |
Th 4.12 | Mixtures of Gaussians, K-means | example code: testEM_MoG.zip (matlab), testEM_MoG.ipynb (python) | notes17.pdf, EMslides.pdf |
Tu 4.17 | Factor analysis | | notes18.pdf |
Th 4.19 | FA and probabilistic PCA | | notes19.pdf |
Tu 4.24 | EM for factor analysis | | |
Th 4.26 | Variational autoencoders (VAEs) | | slides21.pdf | hw4.zip (due Mon 5/7)
Tu 5.01 | Kalman filter & latent linear dynamical system (LDS) models | | |
Th 5.03 | Hidden Markov models (HMMs) | | |
Th 5.10 | Course project presentations | | |
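As a small taste of the EM material (Th 4.12, Mixtures of Gaussians), here is a minimal, generic Python sketch of EM for a two-component 1-D Gaussian mixture. It is an illustrative stand-in written for this page, not the course's testEM_MoG example code; the simulated data and initial parameter values are arbitrary assumptions.

    import numpy as np

    # simulate 1-D data from two Gaussians (hypothetical, not course data)
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 100)])

    # initial parameters for a 2-component mixture (arbitrary starting values)
    pi = np.array([0.5, 0.5])     # mixing weights
    mu = np.array([-1.0, 1.0])    # component means
    var = np.array([1.0, 1.0])    # component variances

    def gauss(x, mu, var):
        # Gaussian density, broadcast over components
        return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

    for _ in range(100):
        # E-step: posterior responsibility of each component for each point
        lik = pi * gauss(x[:, None], mu, var)          # shape (N, 2)
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and variances from the responsibilities
        Nk = resp.sum(axis=0)
        pi = Nk / x.size
        mu = (resp * x[:, None]).sum(axis=0) / Nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

    print("mixing weights:", np.round(pi, 2))
    print("means:         ", np.round(mu, 2))
    print("variances:     ", np.round(var, 2))

Running the sketch recovers mixing weights, means, and variances close to the simulated values.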
 

page maintained by Jonathan Pillow