Statistical Modeling and Analysis of Neural Data

NEU 560 (Fall 2020), Princeton University
time: Tu/Th 10:00-11:20
location: cyberspace
instructor: Jonathan Pillow
AI (Assistant in Instruction): Orren Karniol-Tambour (office hours: Tuesdays 9:00-10:00am)
prerequisites: A good working knowledge of calculus, linear algebra, and basic probability and statistics. Familiarity with Python is also desirable, as homework assignments and final projects will involve programming. No prior experience with neural data is required.
brief description: This course aims to introduce students to methods for modeling and analyzing neural datasets, with an emphasis on statistical approaches to the problem of information processing in neural populations. A tentative list of topics includes: neural encoding models, Poisson processes, generalized linear models, logistic regression, Gaussian processes, latent variable models, factor analysis, mixture models, the EM algorithm, the Kalman filter, and latent dynamical models of neural activity. The course is aimed at students from quantitative backgrounds (neuroscience, engineering, math, physics, computer science, statistics, psychology) who are interested in the modeling and analysis of neural data.
syllabus: pdf
Piazza site: link
Lecture Schedule

Date      Topic (with readings, slides, notes, and homework)

Tu 9.01   Course Intro. Reading: spikes_intro.pdf. Slides: slides01.pdf.
Th 9.03   Linear algebra review. Readings: linear algebra handouts (E. Simoncelli handout (pdf); M. Jordan chapter (pdf)). Slides: slides02.pdf. Notes: intro.ipynb.
Tu 9.08   Singular Value Decomposition. Slides: wb03.pdf. Notes: notes03.pdf. HW: hw0.ipynb, hw0_ans.ipynb.
Th 9.10   SVD applications. Slides: slides04.pdf. Notes: notes04.pdf.
Tu 9.15   PCA. Slides: slides05_linsys.pdf, wb05.pdf. Notes: notes05_PCA.pdf. HW: hw1.colab (due Weds 9/30).
Th 9.17   PCA continued. Slides: slides06_PCA2.pdf.
Tu 9.22   Least-squares regression. Reading: Kaufman et al., 2014. Slides: slides07.pdf, wb07.pdf. Notes: notes07_LeastSquares.pdf.
Th 9.24   Probability review. Reading: Bishop (PRML), Sec. 1.2. Slides: slides08_Probability.pdf.
Tu 9.29   Information theory. Slides: slides09a_Probability2.pdf, slides09b_InfoTheory.pdf.
Th 10.01  Encoding models & maximum likelihood. Slides: slides10_EncodingModels.pdf.
Tu 10.06  Generalized linear models (GLMs). Slides: slides11_GLMs.pdf.
Th 10.08  GLMs II (spike-history, multi-neuron models). Slides: slides12_GLMs2.pdf. HW: hw2.colab (due Weds 10/21).
Tu 10.13  Fall recess (no class).
Th 10.15  MAP inference & regularization. Slides: wb13_MAP.pdf. Notes: notes13_MAP.pdf.
Tu 10.20  MAP inference 2. Slides: wb14_MAP2.pdf. Notes: notes14_MAP2.pdf.
Th 10.22  Model selection: cross-validation and evidence optimization. Slides: wb15_XV.pdf. Notes: notes15_XV.pdf. HW: hw3.colab (GLMs; due Fri 11/06).
Tu 10.27  Empirical Bayes & Laplace Approximation. Slides: slides16_EB.pdf, wb16_Laplace.pdf. HW: ProjectIdeas.pdf (due Mon 11/16).
Th 10.29  Laplace Approximation 2. Slides: slides17_Laplace2.pdf.
Tu 11.03  Latent variable models. Slides: wb18_LVMs.pdf. Notes: notes18_LVMs.pdf.
Th 11.05  EM algorithm. Slides: slides19_warmup.pdf, wb19_EM.pdf.
Tu 11.10  Mixtures of Gaussians. Slides: slides20_EM.pdf, wb20_MoG.pdf. Notes: notes20_MoG.pdf.
Th 11.12  Factor Analysis. Slides: wb21_kmeans_FA.pdf.
Tu 11.17  FA, PPCA, & Variational Autoencoders (VAEs). Slides: slides22_FA_VAE1.pdf. Notes: notes22_FA_PPCA.pdf.
Th 11.19  VAEs II. Slides: slides23_VAEs2.pdf. HW: hw4.colab (due Dean's Date, 12/08).
Tu 11.24  HMMs & Latent Linear Dynamical Systems. Reading: Bishop (PRML), Chap. 13. Slides: wb24_LDSandHMM.pdf.
TBA       Course Project presentations.
page maintained by Jonathan Pillow