| Title | Boosting shift-invariant features | 
| Publication Type | Conference Paper | 
| Year of Publication | 2009 | 
| Authors | Hörnlein, T, Jähne, B, Süße, H | 
| Editor | Denzler, J, Notni, G | 
| Conference Name | Pattern Recognition | 
| Publisher | Springer | 
| Abstract | This work presents a novel method for training shift-invariant features using a Boosting framework. Features performing local convolutions followed by subsampling are used to achieve shift-invariance. Other systems using this type of feature, e.g. Convolutional Neural Networks, use complex feed-forward networks with multiple layers. In contrast, the proposed system adds features one at a time using smoothing spline base classifiers. Feature training optimizes base classifier costs. Boosting sample-reweighting ensures that the features are both descriptive and independent. Our system has fewer design parameters than comparable systems, so adapting it to new problems is simple. The stage-wise training also makes it very scalable. Experimental results show the competitiveness of our approach. | 
| DOI | 10.1007/978-3-642-03798-6_13 | 
| Series | Lecture Notes in Computer Science  |  
| Citation Key | hoernlein2009 | 
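The abstract's core mechanism, local convolution followed by subsampling, can be illustrated with a minimal sketch. This is not the paper's implementation: the 1-D signal, the filter `k`, the pooling width, and the use of max-pooling as the subsampling step are all assumptions for illustration. The point is that after pooling, a small shift of the input leaves the dominant feature response unchanged.

```python
import numpy as np

def conv_subsample(signal, kernel, pool=4):
    """Local convolution followed by subsampling (max-pooling) --
    a simplified stand-in for the shift-invariant features in the abstract."""
    resp = np.convolve(signal, kernel, mode="same")
    # Pad so the length divides the pool width, then take the max per window.
    pad = (-len(resp)) % pool
    resp = np.pad(resp, (0, pad))
    return resp.reshape(-1, pool).max(axis=1)

# A small pattern, and the same pattern shifted by one sample.
x = np.zeros(16)
x[5:8] = [1.0, 2.0, 1.0]
x_shifted = np.roll(x, 1)

k = np.array([1.0, 2.0, 1.0])  # hypothetical local filter
f1 = conv_subsample(x, k)
f2 = conv_subsample(x_shifted, k)

# The peak response lands in the same pooled window with the same value,
# so the feature is tolerant to the one-sample shift.
print(np.argmax(f1) == np.argmax(f2), f1.max() == f2.max())
```

A full system in the spirit of the paper would learn many such filters stage-wise under Boosting, with each base classifier fit on the reweighted training sample; the sketch above only demonstrates why the convolution-plus-subsampling feature itself is shift-tolerant.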


