LSTM Self-Supervision for Detailed Behavior Analysis

Title: LSTM Self-Supervision for Detailed Behavior Analysis
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Brattoli, B, Büchler, U, Wahl, A-S, Schwab, ME, Ommer, B
Conference Name: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Note: BB and UB contributed equally
Abstract

Behavior analysis provides a crucial non-invasive and easily accessible diagnostic tool for biomedical research. A detailed analysis of posture changes during skilled motor tasks can reveal distinct functional deficits and their restoration during recovery. Our specific scenario is based on a neuroscientific study in which rodents recover from a large sensorimotor cortex stroke and their skilled forelimb grasping is recorded. Given large amounts of unlabeled videos that are recorded during such long-term studies, we seek an approach that captures fine-grained details of posture and its change during rehabilitation without costly manual supervision. Therefore, we utilize self-supervision to automatically learn accurate posture and behavior representations for analyzing motor function. Learning our model depends on the following fundamental elements: (i) limb detection based on a fully convolutional network is initialized solely using motion information, (ii) a novel self-supervised training of LSTMs using only temporal permutation yields a detailed representation of behavior, and (iii) back-propagation of this sequence representation also improves the description of individual postures. We establish a novel test dataset with expert annotations for evaluation of fine-grained behavior analysis. Moreover, we demonstrate the generality of our approach by successfully applying it to self-supervised learning of human posture on two standard benchmark datasets.
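To make element (ii) of the abstract more concrete, below is a minimal sketch of the general idea of using temporal permutation as a self-supervision signal for an LSTM: sequences of per-frame posture features are presented either in their original order or temporally shuffled, and the network learns to tell the two apart. This is an illustrative reading, not the authors' exact formulation; the class name `OrderVerificationLSTM`, the feature and hidden dimensions, and the binary ordered-vs-permuted objective are assumptions for the example.

```python
# Sketch only: an LSTM trained to distinguish correctly ordered frame-feature
# sequences from temporally permuted ones (illustrative, not the paper's exact setup).
import torch
import torch.nn as nn

class OrderVerificationLSTM(nn.Module):
    def __init__(self, feat_dim=128, hidden_dim=256):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 2)  # ordered vs. permuted

    def forward(self, seq):              # seq: (batch, time, feat_dim)
        _, (h_n, _) = self.lstm(seq)     # final hidden state summarizes the sequence
        return self.classifier(h_n[-1])  # logits for the two classes

def make_training_pair(frames):
    """Build a positive (ordered) and a negative (shuffled) example from one clip."""
    perm = torch.randperm(frames.size(0))
    return frames, frames[perm]

# Toy usage: random per-frame features stand in for CNN posture features.
model = OrderVerificationLSTM()
criterion = nn.CrossEntropyLoss()
frames = torch.randn(16, 128)            # 16 frames, 128-d feature per frame
pos, neg = make_training_pair(frames)
batch = torch.stack([pos, neg])          # (2, 16, 128)
labels = torch.tensor([0, 1])            # 0 = ordered, 1 = permuted
loss = criterion(model(batch), labels)
loss.backward()                          # in the full pipeline, element (iii) would also
                                         # propagate this gradient into the posture CNN
```

In this reading, no manual labels are needed because the "ordered vs. permuted" supervision comes for free from the video itself, which matches the abstract's goal of avoiding costly annotation.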

Citation Key: buechler:CVPR:2017