Statistical Modeling and Analysis of Neural Data

NEU 560, Princeton University (Spring 2024)
time: Tu/Th 10:00-11:20
location: PNI A02
instructor: Jonathan Pillow
(office hours: Tues 11:30-12:30 & by appt, PNI 254)
AI (assistant in instruction): Victor Geadah
(office hours: Thur 3-4pm & by appt, PNI 232D)
prerequisites: A good working knowledge of calculus, linear algebra, and basic probability/statistics. Familiarity with Python is also desirable, as homework assignments and final projects will involve programming. No prior experience with neural data is required.
brief description: This course aims to introduce students to methods for modeling and analyzing neural datasets, with an emphasis on statistical approaches to the problem of information processing in neural populations. A tentative list of topics includes: neural encoding models, Poisson processes, generalized linear models, logistic regression, Gaussian processes, latent variable models, factor analysis, mixture models, EM, Kalman filter, VAEs, latent dynamical models of neural activity. The course is aimed at students from quantitative backgrounds (neuroscience, engineering, math, physics, computer science, statistics, psychology) who are interested in the modeling and analysis of neural data.
syllabus: pdf
Ed discussion: link
Lecture Schedule

Date | Topic | Readings | Slides | Notes | HW

Tu 1.30 | Course Intro | readings: spikes_intro.pdf | slides: slides01.pdf
    linear algebra review: slides01b.pdf
    linear algebra handouts: Simoncelli handout (pdf), M. Jordan chapter (pdf)
    linear algebra review: slides & online lecture
Th 2.01 | Singular Value Decomposition (SVD) | slides: slides02.pdf | notes: notes02_SVD.pdf | check out: colab-intro.ipynb
Tu 2.06 | Principal Components Analysis (PCA) | slides: slides03_PCA.pdf | notes: notes03_PCA.pdf | (see sketch 1 after the schedule)
Th 2.08 | PCA II & Least-squares | slides: slides04_LeastSquares.pdf | notes: notes04_LeastSquares.pdf | hw: hw0.colab, hw0_ans.colab
Tu 2.13 | Neuroscience applications (PCA & least squares) | readings: Kaufman et al, 2014; Musall et al, 2019 | slides: slides05_NeuroApplications.pdf | hw: hw1.colab (due Fri 2/23)
Th 2.15 | Probability review | slides: slides06_Probability.pdf
Tu 2.20 | Statistics, Entropy | readings: Bishop book, Sec. 1.2 | slides: slides07_Entropy.pdf
Th 2.22 | Information Theory | slides: slides08_InfoTheory.pdf
Tu 2.27 | Maximum Likelihood & Generalized Linear Models (GLMs) | slides: slides09_MLandGLMs.pdf | hw: hw2.colab (due Fri 3/8)
Th 2.29 | GLMs II | slides: slides10_GLMs2.pdf
Tu 3.05 | no class (Cosyne)
Th 3.07 | Regularization | slides: slides11_Regularization.pdf | notes: notes11_Regularization.pdf
Tu 3.12 | spring break
Th 3.14 | spring break
Tu 3.19 | MAP inference | slides: slides12_MAP.pdf | notes: notes12_MAP.pdf | hw: hw3.colab (due 3/29)
Th 3.21 | Model selection (cross-validation, evidence optimization), Empirical Bayes | slides: slides13_ModelSelection.pdf | notes: notes13_ModelSelect.pdf
Tu 3.26 | Laplace Approximation | slides: slides14_LaplaceApprox.pdf | ProjectIdeas.pdf (due Tues 4/09)
Th 3.28 | Latent Variable Models | slides: slides15_LVMs.pdf | notes: notes15_LVMs.pdf
Tu 4.02 | Expectation Maximization (EM) | slides: slides16_EM.pdf | notes: notes16_EM.pdf
Th 4.04 | virtual lecture: EM for Gaussian Mixture Models | example code: testEM_MoG.zip (matlab), testEM_MoG.ipynb (python) | slides: slides17_MoG.pdf | notes: notes17_MoG.pdf | (see sketch 2 after the schedule)
Tu 4.09 | Monte Carlo integration, importance sampling | slides: slides18_MCandIS.pdf
Th 4.11 | Factor Analysis & PPCA | slides: slides19_FA.pdf | notes: notes19_FA_PPCA.pdf
Tu 4.16 | Variational Auto-Encoders (VAEs) | slides: slides20_VAEs.pdf | hw: hw4.colab (due Thurs 4/25)
Th 4.18 | Hidden Markov Models (HMMs) | readings: Bishop book, Chap. 13.2 | slides: slides21_HMMs.pdf
Tu 4.23 | Latent Linear Dynamical Systems (LDS), Kalman Filter-Smoother | readings: Bishop book, Chap. 13.3 | slides: slides22_LDS.pdf
Th 4.25 | Survey of Neural Latent Time Series Models | slides: slides23_NeuroLVMs.pdf
TBA | Course Project Presentations
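
Illustrative code sketches

Sketch 1 (see the Tu 2.06 row): a minimal Python illustration of the PCA-via-SVD connection covered in the 2.01-2.08 lectures. This is a sketch under toy assumptions, not course code; the data and variable names are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 10))        # toy data: 500 samples, 10 features (invented)

    # Center each feature; PCA operates on mean-zero data.
    Xc = X - X.mean(axis=0)

    # Economy-size SVD: Xc = U @ diag(s) @ Vt.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    pcs = Vt                              # rows of Vt are the principal axes
    var_explained = s**2 / (len(Xc) - 1)  # eigenvalues of the sample covariance
    scores = Xc @ Vt.T                    # projection of each sample onto the PCs

    # Sanity check: same eigenvalues as a direct eigendecomposition of the covariance.
    evals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
    assert np.allclose(var_explained, evals)

The sanity check states the identity the lectures exploit: the squared singular values of the centered data matrix, scaled by 1/(n-1), are the eigenvalues of the sample covariance.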
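
Sketch 2 (see the Th 4.04 row): a minimal EM loop for a two-component 1-D Gaussian mixture. The course's reference implementations are testEM_MoG.zip (matlab) and testEM_MoG.ipynb (python), linked above; this toy version is independent of them, with invented data and variable names.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    # Toy data: two Gaussian clusters in 1-D (invented for illustration).
    x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

    # Initial guesses for mixture weights, means, and stdevs.
    w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

    for _ in range(100):
        # E-step: posterior responsibility of each component for each point.
        lik = w * norm.pdf(x[:, None], mu, sd)   # shape (n_points, 2)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum-likelihood updates.
        n_k = r.sum(axis=0)
        w = n_k / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        sd = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / n_k)

    print("weights:", w, "means:", mu, "stdevs:", sd)

Each pass alternates an E-step (responsibilities under the current parameters) with an M-step (weighted maximum-likelihood updates), which never decreases the data log-likelihood.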
 

page maintained by Jonathan Pillow