Classifiers

MUMT621


Introduction

  • Preprocessing
    • e.g.: segmentation, windowing, FFT, MFCC (mel-frequency cepstrum coefficients)
  • Feature extraction
    • e.g.: centroid, area, mean
  • Feature selection (feature weighting)
  • Classification
    • Training: ground-truth (separated into: training, validation, and testing datasets)
    • Validation
      • holdout method
      • k-fold cross-validation, leave-one-out
      • Bootstrapping: resampling with replacement
    • Ensemble training
      • Bagging (Bootstrap Aggregating): Parallel training on bootstrap resamples
      • Boosting: Sequential training (favours training with wrongly classified samples)
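The validation methods above can be illustrated with a small sketch. This is a minimal, self-contained example of k-fold cross-validation index splitting; the function name and structure are our own, not from the course or any particular library:

```python
def k_fold_indices(n_samples, k):
    """Partition sample indices 0..n-1 into k disjoint folds and
    return (train, test) index pairs, one per fold."""
    indices = list(range(n_samples))
    folds = [indices[i::k] for i in range(k)]  # round-robin assignment
    splits = []
    for i in range(k):
        test = folds[i]
        # training set = all samples not in the held-out fold
        train = [idx for j, f in enumerate(folds) if j != i for idx in f]
        splits.append((train, test))
    return splits

splits = k_fold_indices(10, 5)  # 5 folds of 2 test samples each
```

With k equal to the number of samples this reduces to leave-one-out; the holdout method corresponds to using a single split.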

Classifiers (supervised)

  • Bayes classifier
  • Support Vector Machines
  • Boosting algorithms
  • Hidden Markov models
  • Non-parametric density estimation (distribution-free)
    • k-nearest neighbour (non-greedy, lazy)
    • Neural networks (greedy)
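As a concrete instance of a non-parametric, lazy classifier from the list above, here is a minimal k-nearest-neighbour sketch (illustrative only; the names are our own). "Lazy" means there is no training phase: all work happens at query time:

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training
    samples under squared Euclidean distance."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(pt, x)), label)
        for pt, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

For example, a query point near two "a"-labelled training samples and far from the "b" samples is voted into class "a".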

Clustering (unsupervised)

  • Hierarchical methods
  • k-means
    • Gaussian mixture
  • Self-Organizing Maps
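The k-means algorithm above can be sketched in a few lines: assign each point to its nearest centroid, then move each centroid to the mean of its cluster, and repeat. This is a deliberately simple, deterministic version (centroids seeded with the first k points rather than randomly; names are our own):

```python
def k_means(points, k, iters=20):
    """Cluster points into k groups by iterative assign/update steps."""
    # seed centroids with the first k points (simple deterministic choice)
    centroids = [list(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # assignment step: each point joins its nearest centroid
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # update step: move each centroid to its cluster mean
        for c, members in enumerate(clusters):
            if members:
                centroids[c] = [sum(vals) / len(members)
                                for vals in zip(*members)]
    return centroids, clusters
```

Fitting one Gaussian per cluster instead of a hard assignment leads to the Gaussian mixture view listed above.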

Resources

Created: 2003.03.12 Modified: Ichiro Fujinaga