CS6464 - Statistical Learning Theory

Course Data:

Objectives:

In recent years, algorithms for solving ill-posed problems in multi-dimensional signal processing and big data analytics have gained importance. New methods of signal representation, modeling, optimization and learning have been formulated, which span areas of Machine Learning, Pattern Recognition, Vision and Natural Language Processing. This course provides an overview of the theory and current practice required by students and scholars who intend to specialize in this field and to solve complex machine-learning problems in image, video, text and bioinformatics applications.

Syllabus:

The learning problem and regularization: The learning problem: risk functions, well-posed and ill-posed problems; Tikhonov Regularization; Iterative Regularization via Early Stopping; Manifold Regularization; Structured Sparsity Based Regularization; Empirical and Structural Risk Minimization.
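As a concrete illustration of the Tikhonov regularization listed above, the following is a minimal NumPy sketch on a hypothetical ill-conditioned least-squares problem (the data and the choice of regularization weight are illustrative, not from the course):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned design: two nearly collinear columns make plain
# least squares unstable -- a simple ill-posed problem.
X = rng.normal(size=(50, 2))
X[:, 1] = X[:, 0] + 1e-4 * rng.normal(size=50)
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=50)

def tikhonov(X, y, lam):
    """Closed-form Tikhonov-regularized least squares:
    w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_unreg = tikhonov(X, y, 0.0)  # unregularized: typically unstable here
w_reg = tikhonov(X, y, 1.0)    # regularized: stays at a sensible scale
```

Increasing `lam` trades fidelity to the data for stability of the solution, which is exactly the well-posedness restored by regularization.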

Linear Models for Regression: Subset Selection methods; Shrinkage Methods: Ridge Regression, LASSO, Group LASSO, Least Angle Regression.
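The shrinkage behavior of the LASSO can be sketched with cyclic coordinate descent and the soft-thresholding operator; the toy data and the choice of `lam` below are hypothetical:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding: the shrinkage operator behind the LASSO."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for
    (1/2n) * ||y - Xw||^2 + lam * ||w||_1  (non-zero columns assumed)."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]   # partial residual without feature j
            w[j] = soft_threshold(X[:, j] @ r / n, lam) / (col_sq[j] / n)
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]          # only 3 of 10 features are active
y = X @ w_true + 0.01 * rng.normal(size=100)
w_hat = lasso_cd(X, y, lam=0.1)        # inactive coefficients are driven to zero
```

Unlike ridge regression, which only shrinks coefficients, the L1 penalty sets many of them exactly to zero, performing subset selection implicitly.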

Regularization Networks and SVM: Reproducing Kernel Hilbert Spaces (RKHSs), Mercer's Theorem, Representer theorem, VC Dimension, Hard & Soft margin SVMs; Multiple Kernel Learning, Risk/regret bounds for SVMs, Kernel regression, Convex losses for classification.
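The representer theorem above says the regularized-risk minimizer in an RKHS is a finite kernel expansion over the training points; for the square loss this gives kernel ridge regression in closed form. A small sketch on hypothetical 1-D data (RBF kernel, illustrative hyperparameters):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, lam, gamma=1.0):
    """By the representer theorem, f(x) = sum_i alpha_i * k(x_i, x);
    for the square loss, alpha = (K + lam * I)^{-1} y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=80)

alpha = kernel_ridge_fit(X, y, lam=1e-2)
X_test = np.array([[0.0], [np.pi / 2]])
y_pred = rbf_kernel(X_test, X, 1.0) @ alpha   # should approximate sin here
```

The fit is nonlinear in the inputs but linear in the `alpha` coefficients, which is why a single linear solve suffices.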

On-line Learning: Online classification/regression, Online learning from experts, Online convex optimization, Online-to-batch conversions.
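A standard entry point to the online convex optimization topic is online (projected) gradient descent, which attains O(sqrt(T)) regret for bounded convex losses. A minimal sketch on a hypothetical stream of squared losses against a noisy target (all parameters illustrative):

```python
import numpy as np

def online_gradient_descent(loss_grad, d, T, diameter=1.0):
    """Online gradient descent: play w_t, observe a convex loss,
    step against its gradient with rate ~ 1/sqrt(t), and project
    back onto an l2 ball of the given radius."""
    w = np.zeros(d)
    plays = []
    for t in range(1, T + 1):
        plays.append(w.copy())
        g = loss_grad(t, w)                    # gradient revealed after playing
        w = w - (diameter / np.sqrt(t)) * g
        norm = np.linalg.norm(w)
        if norm > diameter:                    # projection onto the feasible set
            w *= diameter / norm
    return np.array(plays)

# Hypothetical environment: squared loss against a slowly varying target.
rng = np.random.default_rng(3)
targets = 0.5 + 0.05 * rng.normal(size=(100, 2))
grad = lambda t, w: 2.0 * (w - targets[t - 1])
plays = online_gradient_descent(grad, d=2, T=100)
```

Because the learner commits to `w_t` before seeing the loss, performance is measured by regret against the best fixed point in hindsight rather than by a single risk value.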

Advanced topics: Sparse Representation Classifier; Basis Pursuit (BP), M-BP, IrM-BP, M-FOCUSS; M-SBL; Bag-of-Words & Dictionary Learning; Proximal Gradient; ADMM; Auto-encoder & Deep Learning; Transfer Learning & Domain Adaptation.
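Several of these topics meet in the basis-pursuit-denoising objective, which the proximal gradient method (ISTA) solves by alternating a gradient step on the smooth term with a soft-thresholding proximal step. A sketch with a hypothetical random overcomplete dictionary and a 3-sparse code:

```python
import numpy as np

def ista(D, y, lam, n_iter=1000):
    """Proximal gradient (ISTA) for (1/2)||y - Dx||^2 + lam * ||x||_1:
    gradient step on the smooth part, then the l1 proximal operator
    (soft-thresholding) with step 1/L."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ x - y)                # gradient of the smooth term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# Hypothetical overcomplete dictionary (30 measurements, 60 atoms).
rng = np.random.default_rng(4)
D = rng.normal(size=(30, 60)) / np.sqrt(30)
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
y = D @ x_true
x_hat = ista(D, y, lam=0.01)                 # recovers a sparse code
```

ADMM solves the same objective by splitting the smooth and non-smooth terms into separate variables; the proximal (soft-thresholding) step reappears there as one of the subproblems.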

Target Applications: Face Recognition and Verification, Video event representation, CBIR in Large Scale Datasets (e.g. ImageNet), Big Data Analytics, etc.

Books:

  • V. N. Vapnik. Statistical Learning Theory. Wiley, 1998.
  • T. Evgeniou, M. Pontil and T. Poggio, “Regularization Networks and Support Vector Machines”, Advances in Computational Mathematics, 2000.

References:

  • Foundations and Trends in Machine Learning, Now Publishers Inc.
  • Conference proceedings of ICML and NIPS.
  • Journal of the Royal Statistical Society: Series B (Statistical Methodology).

Pre-Requisites

    None

Parameters

Credits        Type      Date of Introduction
4-0-0-0-8-12   Elective  Sep 2016

Previous Instances of the Course


© 2016 - All Rights Reserved - Dept of CSE, IIT Madras