Advanced Pattern Recognition Course

This 5-day course on Advanced Pattern Recognition provides a thorough understanding of the building blocks used in designing and training recognition systems. Techniques such as classification, feature selection, neural networks and support vector machines are discussed in depth. Pitfalls, e.g. overtraining, are explained, together with ways to avoid them. Recently developed techniques are introduced, such as combining classifiers and dissimilarity-based representations; they enable the reliable construction of complicated recognition schemes and offer solutions for applications in which no natural features can be found. Applications in image recognition and segmentation serve as illustrations.

There are two lectures a day (each about 75 minutes), accompanied by intensive laboratory exercises that give the participants ample opportunity to inspect and evaluate the algorithms themselves. The exercises use the Matlab toolbox PRTools, developed by the course organizers.

Daily schedule

  • 8:30 – 17:30 Laboratory class room is open
  • 9:30 – 10:45 Lecture
  • 10:45 – 12:30 Laboratory exercises and experiments
  • 12:30 – 13:30 Lunch
  • 13:30 – 14:30 Laboratory exercises and experiments
  • 14:30 – 15:45 Lecture
  • 15:45 – 17:30 Laboratory exercises and experiments

The first day of the course starts with registration at 9:00 and a lecture at 9:30. The final day of the course ends at 17:00.

Lecture program (preliminary)

day 1 Introduction and Recapitulation of Classification
morning Introduction to Statistical Pattern Recognition: All elements in design, training, evaluation and application of a PR system are briefly introduced.
afternoon Classification, Discriminant Analysis: The principles underlying the various types of classifiers: density estimation, Bayes Discriminant, Fisher Discriminant, error minimization, nearest neighbour rule.
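The nearest neighbour rule mentioned above can be sketched in a few lines. A minimal Python illustration follows (the course labs themselves use PRTools in Matlab; the data here are invented for the example):

```python
import math

def nearest_neighbour(train, label_of, x):
    """1-NN rule: assign x the label of its closest training object."""
    best = min(train, key=lambda t: math.dist(t, x))
    return label_of[best]

# Two tiny 2-D classes: class A near the origin, class B near (4, 4).
train = [(0.0, 0.0), (1.0, 0.0), (4.0, 4.0), (5.0, 4.0)]
label_of = dict(zip(train, ["A", "A", "B", "B"]))

print(nearest_neighbour(train, label_of, (0.5, 0.2)))  # -> A
print(nearest_neighbour(train, label_of, (4.5, 3.8)))  # -> B
```

No training is needed beyond storing the objects; all the work happens at classification time, which is why the rule is a useful baseline throughout the course.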


day 2 Evaluation and Representation
morning Classifier Evaluation and Error Estimation: How to obtain reliable and accurate error estimates. The use of separate sets for training and testing. The apparent error and the generalization error. Rotation and leave-one-out methods. Learning curves; the use of costs and rejects; ROC curves.
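The leave-one-out method can be illustrated with a short sketch: each object is classified by a classifier trained on the remaining n-1 objects, and the fraction of mistakes estimates the generalization error. A hedged Python example using the 1-NN rule on invented data:

```python
import math

def one_nn(train, x):
    """1-NN prediction: the label of the closest training object."""
    return min(train, key=lambda p: math.dist(p[0], x))[1]

def leave_one_out_error(data):
    """Classify each object by the n-1 remaining objects; return the error rate."""
    errors = sum(
        one_nn(data[:i] + data[i + 1:], x) != y
        for i, (x, y) in enumerate(data)
    )
    return errors / len(data)

data = [((0.0, 0.0), "A"), ((1.0, 0.0), "A"), ((0.5, 0.1), "A"),
        ((4.0, 4.0), "B"), ((5.0, 4.0), "B"), ((4.5, 3.9), "B")]
print(leave_one_out_error(data))  # -> 0.0 (the classes are well separated)
```

Note the contrast with the apparent error: testing on the training set itself would trivially give zero for 1-NN, whatever the data.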
afternoon Representations: Various ways objects can be represented in vector spaces: features, pixels, dissimilarities, class confidences. Examples focussing on image patterns.
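The dissimilarity representation mentioned above replaces features by the distances of each object to a set of prototype objects, so that any meaningful dissimilarity measure yields a vector space. A minimal sketch, with an invented length-difference measure on strings standing in for a real dissimilarity:

```python
def dissimilarity_representation(objects, prototypes, d):
    """Represent each object by its vector of dissimilarities to the prototypes."""
    return [[d(x, p) for p in prototypes] for x in objects]

# Hypothetical example: strings, which have no natural feature vector,
# represented by a toy length-difference dissimilarity to three prototypes.
protos = ["cat", "house", "pattern"]
objs = ["dog", "houses"]
d = lambda a, b: abs(len(a) - len(b))
print(dissimilarity_representation(objs, protos, d))  # -> [[0, 2, 4], [3, 1, 1]]
```

Once objects live in this vector space, any of the classifiers from day 1 can be trained on them.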
day 3 Clusters, Images and Features
morning Clustering and Image Segmentation: The problem of unsupervised training. Non-linear mapping. Multi-dimensional scaling. Hierarchical clustering, k-means algorithm. Iterative fitting of a mixture of Gaussians by the Expectation-Maximization (EM) algorithm. Applications to image segmentation: texture images, 3-color images, multi-band images.
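The k-means algorithm alternates two steps: assign every object to its nearest cluster centre, then move each centre to the mean of its assigned objects. A minimal Python sketch on invented 2-D data (PRTools provides its own implementation for the labs):

```python
import math

def kmeans(points, centres, iters=20):
    """Plain k-means: alternate nearest-centre assignment and mean update."""
    for _ in range(iters):
        # Step 1: assign each point to its nearest centre.
        clusters = [[] for _ in centres]
        for x in points:
            i = min(range(len(centres)), key=lambda j: math.dist(x, centres[j]))
            clusters[i].append(x)
        # Step 2: move each centre to the mean of its cluster (keep it if empty).
        centres = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centres[i]
            for i, cl in enumerate(clusters)
        ]
    return centres

points = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.2, 4.9)]
print(kmeans(points, [(0.0, 0.0), (1.0, 1.0)]))
# centres converge near (0.1, 0.05) and (5.1, 4.95)
```

For image segmentation the "points" become pixel values (grey levels, colors or texture responses), and each pixel inherits the label of its cluster.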
afternoon Feature Reduction: Reasons behind feature reduction. Feature extraction and feature selection. Linear and non-linear procedures: PCA, non-linear PCA, Fisher mapping. The use of subspaces.
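PCA, the most common linear extraction procedure, projects the data onto the directions of largest variance, i.e. the leading eigenvectors of the covariance matrix. For 2-D data the eigenvector can be written in closed form, which gives a compact sketch (invented data; real problems are of course higher-dimensional):

```python
import math

def pca_first_component(xs, ys):
    """First principal component of 2-D data from the 2x2 covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) ** 2 for x in xs) / n                     # var(x)
    c = sum((y - my) ** 2 for y in ys) / n                     # var(y)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n   # cov(x, y)
    # Largest eigenvalue of [[a, b], [b, c]] and its eigenvector.
    lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    v = (b, lam - a) if abs(b) > 1e-12 else ((1.0, 0.0) if a >= c else (0.0, 1.0))
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)

# Data that varies mainly along the diagonal x = y:
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 0.9, 2.1, 2.9]
print(pca_first_component(xs, ys))  # -> roughly (0.72, 0.69)
```

Projecting the data onto this single direction keeps almost all of the variance while halving the number of features.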


day 4 Complexity, SVMs, Neural Nets and Combining
morning Classifier Complexity and Support Vector Machines: The pitfall of using complex procedures for training. Adaptation to noise and overtraining. Possible solutions: simplification, regularisation, structural risk minimization by support vector machines.
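The overtraining pitfall can be made concrete with a short experiment: give a very flexible classifier purely random labels. The 1-NN rule reproduces this noise perfectly on the training set, so its apparent error is zero even though nothing can be learned. A Python sketch (invented data; the labs run comparable experiments in PRTools):

```python
import math
import random

def one_nn(train, x):
    """1-NN prediction: the label of the closest training object."""
    return min(train, key=lambda p: math.dist(p[0], x))[1]

# Random points with purely random labels: there is nothing to learn,
# yet the flexible 1-NN rule fits the training set perfectly.
random.seed(0)
data = [((random.random(), random.random()), random.choice("AB"))
        for _ in range(50)]
apparent_errors = sum(one_nn(data, x) != y for x, y in data)
print(apparent_errors)  # -> 0: the classifier has adapted to the noise
```

The zero apparent error is pure adaptation to noise; on an independent test set such a classifier would do no better than guessing, which is exactly the gap that simplification, regularisation and structural risk minimization aim to close.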
afternoon Neural Networks and Combining Classifiers: The principle behind neural network classifiers. From linear to non-linear classifiers. Relation with combining classifiers. Fixed and trainable combining rules. Adaboost.
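The simplest fixed combining rule is majority voting: each base classifier casts a vote and the most frequent label wins. A minimal sketch (the base-classifier outputs here are invented):

```python
from collections import Counter

def majority_vote(classifier_outputs):
    """Fixed combining rule: return the label most base classifiers agree on."""
    return Counter(classifier_outputs).most_common(1)[0][0]

# Three hypothetical base classifiers voting on one object:
print(majority_vote(["A", "B", "A"]))  # -> A
```

Trainable rules replace this fixed vote by a classifier trained on the base-classifier outputs themselves; Adaboost goes further by also reweighting the training objects between rounds.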


day 5 One-Class Classifiers, Active Learning and Discussion
morning One-Class Classifiers and Active Learning: Novelty detection and the concept of one-class classifiers. How to define and train them. Active learning: minimizing the object-labelling effort when labelling is expensive.
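A one-class classifier is trained on target objects only and must reject everything that looks unlike them. One of the simplest definitions is a hypersphere around the class mean, with the radius set from the training distances. A hedged Python sketch on invented data:

```python
import math

def fit_one_class(targets):
    """Minimal one-class classifier: class mean plus an acceptance radius
    set to the largest distance seen in the target training set."""
    n = len(targets)
    mean = tuple(sum(c) / n for c in zip(*targets))
    radius = max(math.dist(x, mean) for x in targets)
    return mean, radius

def is_target(model, x):
    """Accept x as a target object if it falls inside the sphere."""
    mean, radius = model
    return math.dist(x, mean) <= radius

targets = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
model = fit_one_class(targets)
print(is_target(model, (0.5, 0.5)))  # -> True
print(is_target(model, (5.0, 5.0)))  # -> False: flagged as a novelty
```

No outlier examples are needed at training time, which is the defining property of novelty detection.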
afternoon Final experiments and discussion: Final experiments combining various pattern recognition techniques. Concluding discussion.