Activity recognition and event classification are of prime relevance to any intelligent system designed to assist on the move. Several systems have captured signals from a wearable computer with the aim of relating what the user is currently perceiving to what should be happening. Assisting people in this way is one of the most widely championed potentials of wearable sensing and is therefore of significant research interest.
Our work currently focuses on higher-level activity recognition: processing very low-resolution motion images (160×120 pixels) to classify the user's manipulation activity. We base our test environment on supervised learning of the user's behaviour from video sequences. The system observes interactions between the user's hands and various objects, in various locations of the environment, through a wide-angle shoulder-worn camera. The location and the object being manipulated are inferred indirectly, on the fly, from the manipulation motions. From this low-level visual information, user activity is classified as one of a set of previously learned classes.
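As a minimal sketch of such a pipeline, the fragment below downsamples frames to 160×120, summarises each clip by a dense optical-flow histogram, and trains a supervised classifier on labelled clips. The feature choice (Farnebäck flow histograms) and classifier (an RBF SVM) are illustrative assumptions, not the method described in the text.

```python
# Hedged sketch only: the specific features and classifier here are
# assumptions standing in for the "low-level visual information" and
# "supervised learning" mentioned above.
import cv2
import numpy as np
from sklearn.svm import SVC

FRAME_SIZE = (160, 120)  # low-resolution input noted in the text


def motion_feature(frames):
    """Summarise a short clip as a histogram of dense optical-flow
    orientations weighted by magnitude (a hypothetical feature choice)."""
    hist = np.zeros(16)
    prev = cv2.cvtColor(cv2.resize(frames[0], FRAME_SIZE), cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        gray = cv2.cvtColor(cv2.resize(frame, FRAME_SIZE), cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        h, _ = np.histogram(ang, bins=16, range=(0, 2 * np.pi), weights=mag)
        hist += h
        prev = gray
    return hist / (hist.sum() + 1e-8)


def train_activity_classifier(clips, labels):
    """Supervised training: one feature vector per labelled clip, with
    `labels` holding activity classes (e.g. "open door", "pour water")."""
    X = np.stack([motion_feature(c) for c in clips])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf
```

At test time, an unseen clip would be reduced to the same histogram and passed to `clf.predict`, mirroring the on-the-fly classification described above.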