Slide 1: Boosting and Tree-structured Classifiers
Presenter: Nguyen Dang Binh
Slide 2: Classification speed matters not just for time efficiency but also for accuracy
Slide 3: Object Detection
by a Cascade of Classifiers
Pictures from Romdhani et al. ICCV01
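To make the cascade idea concrete, here is a minimal sketch (illustrative only, not the exact Viola-Jones implementation): stages are ordered cheap-to-expensive, and a window exits as soon as any stage rejects it, so the vast majority of negative windows cost only a few feature evaluations.

```python
# Minimal cascade sketch (illustrative; names are hypothetical).
# Each stage is a (score_fn, threshold) pair, ordered cheap-to-expensive.

def cascade_classify(x, stages):
    for score_fn, threshold in stages:
        if score_fn(x) < threshold:
            return False   # early reject: most negative windows exit here
    return True            # survived every stage: report a detection
```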
Slide 4: Object Tracking
by Fast (Re-)Detection
From time t to t+1
Online discriminative feature selection [Collins et al 03], Ensemble tracking [Avidan 07]
Slide 5: Semantic Segmentation
Requiring pixel‐wise classification
Slide 6: Structure of this talk
Second half: Unified Boosting framework
Slide 7: Things not covered
Slide 8: Introduction to Boosting Classifiers
Slide 10: A brief history (continued)
Slide 13: Bagging (Bootstrap AGGregatING)
Bootstrap
For each set, randomly draw examples from the uniform distribution, allowing duplicates and omissions.
Slide 14: In classification: \(H(\mathbf{x})\) = the majority class in \(\{h_1(\mathbf{x}), \ldots, h_T(\mathbf{x})\}\)
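A minimal sketch of the bagging procedure described above, assuming a scikit-learn-style base learner with fit/predict (the names here are illustrative, not from the slides):

```python
import numpy as np

def bagging_fit(base_learner, X, y, n_estimators=10, rng=None):
    """Bagging sketch: train n_estimators models on bootstrap sets."""
    rng = rng or np.random.default_rng()
    models = []
    n = len(X)
    for _ in range(n_estimators):
        # Bootstrap set: draw n examples uniformly *with* replacement,
        # so duplicates appear and some examples are left out.
        idx = rng.integers(0, n, size=n)
        models.append(base_learner().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # H(x) = the majority class among h_1(x), ..., h_T(x);
    # for labels in {-1, +1} the majority vote is the sign of the sum
    # (ties would be broken arbitrarily in a real implementation).
    votes = np.stack([m.predict(X) for m in models])
    return np.sign(votes.sum(axis=0))
```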
Slide 15: Randomized Decision Forest [Breiman 01, Geurts et al 06]
Slide 16: Randomized Tree Learning
(diagram: node split into left and right children)
Features \(\phi\) are chosen from a random feature pool \(\Phi\).
Thresholds \(\tau\) are chosen within a range.
Choose \(\phi\) and \(\tau\) to maximize the information gain.
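A sketch of one node split under these rules: sample a small random pool of features and thresholds, and keep the pair with the highest information gain. Pool sizes and the data layout are illustrative assumptions.

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def random_split(X, y, n_features=10, n_thresholds=10, rng=None):
    """Pick (feature phi, threshold tau) maximizing information gain,
    searching only a small random pool rather than all possibilities."""
    rng = rng or np.random.default_rng()
    best = (None, None, -np.inf)
    pool = rng.choice(X.shape[1], size=min(n_features, X.shape[1]),
                      replace=False)
    for phi in pool:
        lo, hi = X[:, phi].min(), X[:, phi].max()
        for tau in rng.uniform(lo, hi, size=n_thresholds):
            left, right = y[X[:, phi] < tau], y[X[:, phi] >= tau]
            if len(left) == 0 or len(right) == 0:
                continue
            gain = entropy(y) - (len(left) * entropy(left)
                                 + len(right) * entropy(right)) / len(y)
            if gain > best[2]:
                best = (phi, tau, gain)
    return best  # (feature index, threshold, information gain)
```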
Slide 17: Random Forest – Summary
Generalization through bagging (random samples) and randomised tree learning (random features)
Very fast classification
Slide 18: Iteratively reweighting training samples.
Higher weights to previously misclassified samples.
(figure: decision boundary after 1 round, 2 rounds, 50 rounds)
Slide 19: In classification: \(H(\mathbf{x})\) = the majority class in \(\{h_1(\mathbf{x}), \ldots, h_T(\mathbf{x})\}\), according to the weights \(\alpha_t\)
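A compact sketch of discrete AdaBoost, combining the reweighting rule from slide 18 with the weighted vote above. The weak-learner interface (weak_fit returning a hypothesis with predict) is an assumption for illustration.

```python
import numpy as np

def adaboost_fit(weak_fit, X, y, n_rounds=50):
    """Discrete AdaBoost sketch. y in {-1, +1}; weak_fit(X, y, w) must
    return a hypothesis with predict(X) -> {-1, +1} trained on weights w."""
    n = len(X)
    w = np.full(n, 1.0 / n)                # start from the uniform dist.
    hypotheses = []
    for _ in range(n_rounds):
        h = weak_fit(X, y, w)
        pred = h.predict(X)
        err = max(w[pred != y].sum(), 1e-10)   # guard against log(0)
        if err >= 0.5:                     # no weak learner exists: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        # Reweight: previously misclassified samples gain weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        hypotheses.append((alpha, h))
    return hypotheses

def adaboost_predict(hypotheses, X):
    # H(x) = weighted majority vote: sign of sum_t alpha_t * h_t(x)
    return np.sign(sum(a * h.predict(X) for a, h in hypotheses))
```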
Slide 20: AdaBoost [Freund and Schapire 97]
Slide 21: Existence of weak learners
XOR problems (Matlab demo)
Slide 22: Does AdaBoost generalize?
Slide 23: Multiple classifier systems
Slide 24: Mixture of Experts [Jordan, Jacobs 94]
Gating Network
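A minimal sketch of Mixture-of-Experts inference: the gating network scores the input, a softmax turns the scores into mixing weights, and the output is the gate-weighted sum of the expert outputs. The linear gate gate_W is an illustrative choice, not from the slides.

```python
import numpy as np

def moe_predict(x, experts, gate_W):
    """Mixture of Experts inference sketch: a linear gating network
    softly assigns input x to the experts (gate_W is hypothetical)."""
    scores = gate_W @ x                   # one score per expert
    g = np.exp(scores - scores.max())
    g /= g.sum()                          # softmax gating weights
    # Output = gate-weighted combination of expert predictions.
    return sum(gi * expert(x) for gi, expert in zip(g, experts))
```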
Slide 25: Ensemble learning: Boosting and Bagging
Slide 26: Robust real-time object detector
Slide 27: Boosting Simple Features [Viola and Jones CVPR 01]
AdaBoost classification
Weak classifier
Strong classifier
Weak classifiers: Haar-basis-like functions (45,396 in total)
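The speed of these weak classifiers comes from the integral image: any rectangle sum costs four lookups, so a Haar-like feature costs a handful of additions. A sketch (the two-rectangle layout and coordinates are illustrative):

```python
import numpy as np

def integral_image(img):
    """Cumulative sums so any rectangle sum costs 4 lookups."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] read off the integral image ii."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0: total -= ii[r0 - 1, c1 - 1]
    if c0 > 0: total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
    return total

def haar_two_rect(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: left half minus right half."""
    left = rect_sum(ii, r, c, r + h, c + w // 2)
    right = rect_sum(ii, r, c + w // 2, r + h, c + w)
    return left - right

# A weak classifier then thresholds one such feature f:
# h(x) = +1 if p * f(x) < p * theta else -1, with parity p in {-1, +1}.
```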
Slide 28: Boosting Simple Features [Viola and Jones CVPR 01]
Slide 29: Boosting as a Tree-structured Classifier
Slide 30: Boosting (a very shallow network)
Trang 32B i Bagging
Tree hierarchy
Inspired by Yin, Criminisi CVPR07
Slide 33: BREAK!!
Slide 34: Unified Boosting Framework
Slide 35: AnyBoost: a unified framework [Mason et al 00]
Most boosting algorithms have in common that they iteratively update sample weights and select the next hypothesis based on the weighted samples.
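In AnyBoost terms, those sample weights are the magnitudes of the negative functional gradient of a margin loss at the current ensemble output; plugging in the exponential loss recovers AdaBoost-style weights, while other losses yield other boosting variants within the same framework. A sketch:

```python
import numpy as np

def anyboost_weights(F, y, loss_grad):
    """AnyBoost sketch: F are the current ensemble scores, y in {-1, +1},
    loss_grad is the derivative of the margin loss L(m) at m = y * F.
    The next weak hypothesis is then chosen on these weighted samples."""
    w = -y * loss_grad(y * F)        # negative functional gradient
    return np.abs(w) / np.abs(w).sum()

# Exponential loss L(m) = exp(-m) gives weights proportional to
# exp(-y * F), i.e. higher weights for currently misclassified samples,
# which is exactly AdaBoost's reweighting rule.
exp_loss_grad = lambda m: -np.exp(-m)
```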
Slide 38: Tree-structured Classifiers
Slide 39: Multi-view and multi-category object detection
Images exhibit multi-modality.
Slide 40: Multiclass object detection [Torralba et al PAMI 07]
Slide 41: Multiclass object detection [Torralba et al PAMI 07] (continued)
Slide 42: Multiclass object detection [Torralba et al PAMI 07] (continued)
Slide 43: ClusterBoost [Wu et al ICCV07]
Slide 44: ClusterBoost result
Slide 45: AdaTree [Grossmann CVPRW04] result
(figure: mean computation cost, at 1% and 10% noise in the data)
Slide 46: Boosting for XOR - MCBoost [T-K Kim, R Cipolla NIPS 08]
Slide 47: (figure: face clusters 1 and 2, by K-means clustering vs. MCBoost)
ICCV09 Tutorial, Tae-Kyun Kim, University of Cambridge
Slide 48: MCBoost: Multiple Strong Classifiers
Slide 49: Toy XOR classification problem
Discriminative clustering of positive samples
It does not partition the input space.
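A rough sketch of this idea (not the exact NIPS'08 algorithm): K strong classifiers are boosted jointly, each positive sample is softly re-assigned at every round to the classifier that currently scores it best, and an input is accepted if any of the K classifiers accepts it. This clusters the positives discriminatively, by classification ease, rather than in input space as K-means does, which is what makes XOR-like positive sets separable.

```python
import numpy as np

def mcboost_responsibilities(scores):
    """scores: (K, n) array of current strong-classifier outputs H_k(x_i).
    Returns a soft assignment of each sample to each classifier, used to
    modulate the per-classifier boosting weights (sketch, not the paper's
    exact update)."""
    e = np.exp(scores - scores.max(axis=0))
    return e / e.sum(axis=0)

def mcboost_classify(scores):
    # An input is positive if *any* of the K strong classifiers accepts
    # it (OR-ing the experts), so non-convex positive sets become solvable.
    return np.sign(scores.max(axis=0))
```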
Slide 50: Toy XOR classification problem
(plots: classifier 1, 2, 3 over boosting rounds; Matlab demo)
Slide 52: Pedestrian detection by MCBoost [Wojek, Walk, Schiele et al CVPR09]
Slide 53: Pedestrian detection by MCBoost [Wojek, Walk, Schiele et al CVPR09] (continued)
Slide 55: Speeding up
Slide 56: A sequential structure of varying length
Classify x as negative and exit.
Slide 57: Making a shallow network deep - Super tree [Kim, Budvytis, Cipolla 09]
Slide 58: Converting a boosting classifier to a decision tree
Many short paths for speeding up
Preserving (smooth) decision regions for good generalisation
Slide 59: Converting a boosting classifier to a decision tree
Many short paths for speeding up
(figure: evaluation path lengths, Boosting vs. Super tree; about 5 times speed-up)
Slide 61: Boolean optimisation formulation
(figure: decision regions indexed by the binary weak-learner outputs \(W_1, W_2, W_3\); e.g. region \(R_8\) has code 1 1 1)
Slide 62: Boolean expression minimization
(figure: truth-table grouping of regions such as \(R_3\) and \(R_4\))
A disjunction of minterms over \(W_1, W_2, W_3\) is minimized to a much shorter equivalent expression.
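The minimization step can be reproduced with sympy's SOPform (our tool choice, not necessarily the paper's); the minterms below are illustrative, since the slide's overbars did not survive extraction. Each minterm is one combination of binary weak-learner outputs that the boosting classifier labels positive, and SOPform returns a minimal sum-of-products covering them.

```python
from sympy import symbols
from sympy.logic import SOPform

# Illustrative minterms over three weak-learner outputs: the boosted
# classifier says "positive" for these five output combinations.
w1, w2, w3 = symbols('w1 w2 w3')
minterms = [[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1], [0, 1, 1]]
print(SOPform([w1, w2, w3], minterms))   # -> w1 | (w2 & w3)
```

The minimized form means a tree can test \(W_1\) first and exit immediately on many inputs, which is exactly the short-path behaviour the Super tree exploits.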
Slide 63: Boolean optimisation formulation
Slide 64: Synthetic data experiment 1
Examples generated from GMMs
Slide 65: Face detection experiment
(plots: false negative rate vs. false positive rate, and average path length)
Slide 66: ST vs. Random Forest
ST is about 2 to 3 times faster than RF at similar accuracy.
Slide 67: Experiments with tracking and segmentation by ST
Slide 68: More segmentation experiments with Boosting and RF
Slide 69: MCBoost building-class segmentation
Building class: average class accuracy = 77%
(figure pairs: ground truth vs. segmentation)
Slide 70: (results: building vs. non-building and road vs. non-road segmentation errors; reported accuracies include global 74.50% / average 79.89%, global 88.33% / average 87.26%, 80.45%, and global 85.55% / average 85.24%)
Slide 71: Boosting for Segmentation
Slide 72: Conclusions
Slide 74: Bagging
Tree hierarchy
Inspired by Yin, Criminisi CVPR07
Slide 75: Boosting (stumps)
• Fast classification
• Theoretical backgrounds
• Slow training
• Slower than RF
Slide 77: Thanks!