Training data recycling for multi-level learning

Jingchen Liu, Scott McCloskey, Yanxi Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Among ensemble learning methods, stacking with a meta-level classifier is frequently adopted to fuse the outputs of multiple base-level classifiers into a final score. Labeled data is usually split into base-training and meta-training sets, so that meta-level learning is not affected by the over-fitting of base-level classifiers on their own training data. We propose a novel knowledge-transfer framework that reuses the base-training data for learning the meta-level classifier without such negative consequences. By recycling the knowledge obtained during the base-classifier-training stage, we make the most efficient use of all available information and achieve better fusion, and thus better overall performance. With extensive experiments on complex video event detection, where training data is scarce, we demonstrate the improved performance of our framework over alternative approaches.
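The conventional setup the abstract contrasts against can be sketched as follows. This is a minimal illustration of stacking with a held-out meta-training split using scikit-learn on synthetic data; the classifiers and split ratio are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch of standard stacking: base classifiers are trained on one split,
# and the meta-level classifier is trained on a disjoint split so it is
# not misled by base-level over-fitting. (Synthetic data; illustrative only.)
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

# Split labeled data into base-training and meta-training sets.
X_base, X_meta, y_base, y_meta = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Train base-level classifiers only on the base-training split.
base_clfs = [DecisionTreeClassifier(random_state=0), GaussianNB()]
for clf in base_clfs:
    clf.fit(X_base, y_base)

def base_scores(X):
    # One column of positive-class probabilities per base classifier;
    # these scores are the meta-classifier's input features.
    return np.column_stack([clf.predict_proba(X)[:, 1] for clf in base_clfs])

# Train the meta-level classifier on the held-out meta-training split.
meta = LogisticRegression().fit(base_scores(X_meta), y_meta)
fused = meta.predict(base_scores(X_meta))
```

The paper's contribution is to recover the information lost to this split by recycling the base-training data at the meta level, rather than discarding it as the sketch above does.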

Original language: English (US)
Title of host publication: ICPR 2012 - 21st International Conference on Pattern Recognition
Number of pages: 5
State: Published - 2012
Event: 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan
Duration: Nov 11 2012 – Nov 15 2012

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651


Other: 21st International Conference on Pattern Recognition, ICPR 2012

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition


