Abstract
March 6, 8, 13 and 15, 2006, 16h15 - 18h15, Room C115
Anestis Antoniadis, Université Joseph Fourier, Grenoble, France
"Support Vector Machines and Statistical Learning"
The foundations of Support Vector Machines (SVMs) were developed by Vapnik,
and SVMs are gaining popularity due to their many attractive features and
promising empirical performance. The formulation embodies the Structural
Risk Minimisation (SRM) principle rather than the traditional Empirical
Risk Minimisation (ERM) principle. SVMs were originally developed to solve
classification problems, but they have recently been extended to the domain
of regression problems.
This course will be an introduction to statistical learning theory
and will then focus on specific algorithms (linear and nonlinear)
that have been developed to solve classification and regression
problems, with particular emphasis on Support Vector Machines and
their kernel generalisations. The specific topics to be addressed are:
1. Basic notions in statistical learning
2. Empirical and structural risk minimisation
3. Empirical processes, upper bounds for the risk and VC dimension
4. Support vector machines
5. Kernels and reproducing kernel Hilbert spaces
6. Kernel SVM and their applications
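As a small taste of topics 5 and 6, the sketch below (not part of the course
materials; an illustrative assumption) computes a Gram matrix for the Gaussian
(RBF) kernel, the kernel most commonly paired with SVMs. A valid reproducing
kernel yields a symmetric, positive semi-definite Gram matrix, which is what
lets SVMs operate implicitly in a reproducing kernel Hilbert space:

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    squared_distance = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * squared_distance)

# Gram matrix for three sample points in the plane.
points = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
K = [[rbf_kernel(p, q) for q in points] for p in points]

# K is symmetric and every point has unit self-similarity
# (k(x, x) = 1), as required of this kernel.
```

A kernel SVM never needs the feature map itself: training and prediction use
only entries of K, which is the "kernel trick" covered in the final lectures.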