Boosting shift-invariant features

Title: Boosting shift-invariant features
Publication Type: Conference Paper
Year of Publication: 2009
Authors: Hörnlein, T, Jähne, B
Editors: Denzler, J, Notni, G, Süße, H
Conference Name: Pattern Recognition
Publisher: Springer
Abstract

This work presents a novel method for training shift-invariant features within a Boosting framework. Shift-invariance is achieved by features that perform local convolutions followed by subsampling. Other systems using this type of feature, e.g. Convolutional Neural Networks, rely on complex multi-layer feed-forward networks; in contrast, the proposed system adds features one at a time using smoothing-spline base classifiers, with feature training driven by minimizing the base classifier cost. Boosting's sample reweighting ensures that the features are both descriptive and independent. Our system has fewer design parameters than comparable systems, so adapting it to new problems is simple, and its stage-wise training makes it highly scalable. Experimental results demonstrate the competitiveness of our approach.
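The two ingredients named in the abstract can be sketched in a few lines: a feature that convolves a local filter with the image and subsamples the response, and an AdaBoost-style reweighting step that makes each newly added feature focus on samples the earlier ones misclassify. This is a minimal illustration under assumed details (average pooling for subsampling, the standard AdaBoost weight update), not the paper's actual smoothing-spline training procedure.

```python
import numpy as np

def shift_invariant_feature(image, kernel, pool=4):
    """Local convolution followed by subsampling (here: average pooling).

    Hypothetical sketch of the feature type described in the abstract;
    the pooling scheme and filter parametrization are assumptions.
    """
    h, w = image.shape
    kh, kw = kernel.shape
    # valid 2-D correlation of the local filter with the image
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    # subsample: average over pool x pool blocks, making the response
    # insensitive to small shifts of the input pattern
    ph, pw = out.shape[0] // pool, out.shape[1] // pool
    pooled = out[:ph * pool, :pw * pool].reshape(ph, pool, pw, pool).mean(axis=(1, 3))
    return pooled.ravel()

def boosting_reweight(weights, labels, predictions):
    """AdaBoost-style sample reweighting (labels/predictions in {-1, +1}).

    Misclassified samples gain weight, so the next feature added in the
    stage-wise procedure must complement the ones trained so far.
    """
    err = float(np.sum(weights[labels != predictions]))
    err = min(max(err, 1e-10), 1 - 1e-10)          # numerical guard
    alpha = 0.5 * np.log((1 - err) / err)          # base classifier weight
    weights = weights * np.exp(-alpha * labels * predictions)
    return weights / weights.sum(), alpha
```

Stage-wise training would alternate these steps: fit one feature against the current weights, evaluate it, reweight, and repeat, which is what keeps the training cost linear in the number of features.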

DOI: 10.1007/978-3-642-03798-6_13
Series: Lecture Notes in Computer Science

Citation Key: hoernlein2009