Advanced Statistical Machine Learning 495

WARNING: This section contains links to pdf files that may be covered by copyright. You may browse them at your convenience in the same spirit as you may read a journal or proceedings article in a public library. Retrieving, copying, or distributing these files, however, may violate copyright protection law. We recommend that users abide by international law when accessing this directory.

Advanced Statistical Machine Learning (course 495) is envisioned as a Master's level course for several groups of students, including MSc Advanced Computing students, MSc in Computing students (specializations, e.g., in Machine Learning and Visual Information Processing), and 4th-year MEng in Computing and Joint Mathematics and Computing students.

 

Course aim:

  1. to provide the students with the necessary theoretical and computational skills to understand, design and implement modern statistical latent variable models, including statistical component analysis (e.g., LDA, PCA, ICA, etc.), as well as probabilistic models such as Probabilistic PCA, Kalman filters, Hidden Markov Models, particle filters and Markov Random Fields (MRFs).
  2. the models that will be studied are the following:
    • Deterministic Component Analysis: Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Independent Component Analysis (ICA), Graph-based Component Analysis, etc.
    • Static Probabilistic Component Analysis: Probabilistic Principal Component Analysis
    • Linear Dynamical Systems: Kalman & particle filters, Hidden Markov Models
    • Markov Random Fields
  3. by the end of the course, students should be able to formulate a latent variable model in a probabilistic manner, devise algorithms for learning the parameters of the model, and perform inference (a small illustrative sketch follows this list).
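
As a flavour of item 3, the following minimal Python/NumPy sketch (not part of the official course material; the function name and toy data are ours) computes a PCA projection by eigendecomposition of the sample covariance matrix:

    import numpy as np

    def pca(X, k):
        """Project the rows of X onto the k leading principal components."""
        X_centred = X - X.mean(axis=0)            # centre the data
        cov = np.cov(X_centred, rowvar=False)     # sample covariance (features x features)
        eigvals, eigvecs = np.linalg.eigh(cov)    # eigendecomposition of a symmetric matrix
        order = np.argsort(eigvals)[::-1][:k]     # indices of the k largest eigenvalues
        W = eigvecs[:, order]                     # projection basis
        return X_centred @ W                      # low-dimensional representation

    # Toy usage: 100 points in 5 dimensions projected to 2
    X = np.random.randn(100, 5)
    Z = pca(X, 2)
    print(Z.shape)  # (100, 2)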

Course material:

  • Book: Pattern Recognition & Machine Learning by Christopher M. Bishop, Springer, 2006 (chapters 1, 2, 8, 9, 11, 12, 13).
  • Notes on Component Analysis (under preparation).
  • Notes on Markov Random Fields (under preparation).
  • Lecture Slides

Course schedule:

The course will run for 9 weeks in total, comprising both lectures and tutorials (both given by Dr. Stefanos Zafeiriou). Assessment: 90% from written exams and 10% from assignments (2 assignments in total).

  • The first will be given next week and should be delivered by the 21st of February 2014.
  • The second will be given on the 24th of February and should be delivered by the 17th of March 2014.

Course Lectures:

  • Lecture 1-2: Introduction (pdf file)
  • Tutorial 1: A primer on probability/statistics (pdf file)
  • Tutorial 2: A primer on linear algebra (pdf file)
  • Tutorial 3: A primer on Optimization (pdf file)
  • Tutorial 4: Gram-Schmidt Orthogonalization (pdf file)
  • Lecture 3-5: Component Analysis: Basic Concepts, Principal Component Analysis, Linear Discriminant Analysis (pdf file), Locality Preserving Component Analysis (pdf file)
  • Tutorial 5-6: Computing Component Analysis: Trace Optimization Problems, Generalized Eigenanalysis and Covariance Matrices (1) (pdf file)
  • Lecture 5-6: Independent Component Analysis (pdf file)
  • Lecture 7: Non-linear Component Analysis (pdf file)
  • Lecture 8-9: Introduction to EM and EM for Mixture Models (pdf file)
  • Lecture 9-10: Probabilistic Principal Component Analysis (pdf file)
  • Tutorial 7: Mixtures of PPCA (pdf file)
  • Tutorial 8: PPCA with Missing Data (pdf file)
  • Lecture 10-11: Markov Models (pdf file)
  • Lecture 12-13: Hidden Markov Models, filtering, smoothing (pdf file) and parameter estimation (pdf file)
  • Tutorial 9-10: EM in Hidden Markov Models and Viterbi algorithm (pdf file)
  • Lecture 14: Kalman Filters (pdf file) (a minimal illustrative sketch follows this list)
  • Tutorial 11: EM in Kalman Filters (pdf file)
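
To give a concrete feel for the Kalman filter lecture, here is a minimal, illustrative predict/update step in Python/NumPy (standard textbook notation; a sketch for orientation only, not the course's reference implementation, and the toy numbers are invented):

    import numpy as np

    def kalman_step(mu, P, y, A, C, Q, R):
        """One predict/update step of a linear-Gaussian Kalman filter.

        mu, P : posterior mean and covariance of the previous state
        y     : current observation
        A, C  : state transition and observation matrices
        Q, R  : process and observation noise covariances
        """
        # Predict: propagate the previous estimate through the dynamics
        mu_pred = A @ mu
        P_pred = A @ P @ A.T + Q
        # Update: correct the prediction using the new observation
        S = C @ P_pred @ C.T + R                # innovation covariance
        K = P_pred @ C.T @ np.linalg.inv(S)     # Kalman gain
        mu_new = mu_pred + K @ (y - C @ mu_pred)
        P_new = (np.eye(len(mu)) - K @ C) @ P_pred
        return mu_new, P_new

    # Toy usage: a 1-D random walk observed with unit noise
    mu, P = np.array([0.0]), np.array([[1.0]])
    A = C = Q = R = np.array([[1.0]])
    mu, P = kalman_step(mu, P, np.array([0.7]), A, C, Q, R)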

Assignments:

  • Individual Assignment 1 (to be delivered by 20th of February) (files)
  • Individual Assignment 2 (to be delivered by 17th of March)

Notes:

  • Introductory Notes (files)
  • Notes on Deterministic Component Analysis (files)
  • Notes on Expectations Maximisation and Gaussian Mixture Models
  • Notes on Completing the Square (files) (the key identity is recalled after this list)
  • Notes on Probabilistic PCA
  • Notes on Hidden Markov Models (pdf file)
  • Notes on Kalman Filters
  • Notes on Markov Random Fields (MRF) and Gaussian MRFs
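
For orientation, the "Completing the Square" notes above refer to the standard Gaussian identity used throughout the probabilistic part of the course (e.g., when deriving posteriors for Probabilistic PCA and the Kalman filter). For a symmetric positive-definite matrix A and a vector b (our notation, for illustration):

    -\tfrac{1}{2}\, x^\top A\, x + b^\top x \;=\; -\tfrac{1}{2}\, (x - A^{-1} b)^\top A\, (x - A^{-1} b) + \tfrac{1}{2}\, b^\top A^{-1} b

so an exponent that is quadratic in x can be read off as a Gaussian with mean A^{-1} b and covariance A^{-1}, up to an additive constant.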

Contact:

For inquiries, please use course495imperial@gmail.com. Office hours: 16:00-17:00 every Monday (office 423, Huxley Building).

Further Reading: Tutorials

These are three modules from www.coursera.com that we recommend you have a look at for specific revision or knowledge enhancement (detailed below):

  • Machine Learning by Andrew Ng (ML)
  • Introduction to Computational Finance and Financial Econometrics (CF)
  • Probabilistic Graphical Models (PGM)

Matrix Review:

  • section Linear Algebra Review from ML (all video lectures)
  • section Matrix Algebra from CF (all video lectures)

Octave Tutorial (similar to MATLAB):

  • section Octave Tutorial from ML (all video lectures)
  • section ML-class Octave Tutorial from PGM (all video lectures)

Overfitting:

  • section Regularization from ML (only the video The Problem of Overfitting)

Precision, Recall rates and F1-measure:

  • section Machine Learning System Design from ML (all video lectures)

Neural Networks:

  • section Neural Networks: Representation from ML (all video lectures)
  • section Neural Networks: Learning from ML (all video lectures)

Probability Review:

  • section Probability Review from CF (all video lectures except for the Portfolio example)

You are strongly advised to look at them.


Subspace Learning

Dimensionality Reduction (paper, toolkit, survey), PCA (paper), LDA (paper 1, tutorial 2), KPCA and KDA (paper 1, paper 2), Manifold Learning (Local Linear Embeddings paper and Laplacian Eigenmaps paper). For Probabilistic Component Analysis, see Chapter 12 of "Pattern Recognition and Machine Learning" by Christopher M. Bishop.

 

Design of Classifiers 

Support Vector Machines (Tutorial 1, Tutorial 2)


Bayesian Learning and Graphical Models

Download tutorials on Dynamic Bayesian Networks (paper), Hidden Markov Models (tutorial), Relevance Vector Machines (tutorial), Gaussian Processes (book, video lecture 1, video lecture 2), Conditional Random Fields (tutorial, video lecture).