Object Recognition for Wearable Visual Robots

Transfer Report: Object Recognition for Wearable Visual Robots

Robert Castle Wolfson College

Robotics Research Group Department of Engineering Science University of Oxford Michaelmas Term 2006

This transfer report is submitted to the Department of Engineering Science, University of Oxford. This transfer report is entirely my own work, and, except where otherwise indicated, describes my own research.

Abstract
An autonomous wearable visual robotic assistant is a robot that is worn by a user and consists of several parts. Mechanically it comprises at least one camera, which may be free to move via a motor assembly; possibly other sensors, such as an inertial sensor; some means of interacting with the wearer, such as a display or audio; and a portable computer. It can be used to complete a wide range of tasks, from simple sensor recording to providing timely, useful information about the user's environment and helping to guide the user to a goal or through a task. A wearable robot would be of great use in many fields, such as the military, maintenance, emergency relief work, and tourism. A wearable visual robot that can move semi-independently of its wearer can complete a wider range of tasks than a fixed camera, such as focusing on what the user is doing or searching the user's workspace for objects while the user continues to work.

To provide a user with useful information about the world, the robot needs to identify objects of interest. It must then track its own location in the world and maintain a map of where those objects are so that it can direct the user to them. To identify objects in an image, we review the work done in the field of object detection and select SIFT (scale invariant feature transform) as an object detector. To enable the robot to keep track of its position in the world and build a map of observed features, we review the literature on SLAM (simultaneous localization and mapping), in particular the extended Kalman filter (EKF) used in a single camera SLAM system that we adopt for our own. Finally, we review the work done on visual wearables, including analysis of the optimal location of an active camera for usefulness across a wide range of tasks.
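SIFT-based detection of the kind selected above rests on matching local descriptors between a stored object view and the current image. As a minimal sketch of the standard matching step (nearest-neighbour search with Lowe's ratio test), using synthetic 128-dimensional descriptors in place of real SIFT output:

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, ratio=0.8):
    """Match descriptors in desc_a to descriptors in desc_b.

    A match is accepted only if the nearest neighbour in desc_b is
    significantly closer than the second-nearest (Lowe's ratio test),
    which rejects ambiguous matches on repetitive texture.
    """
    matches = []
    for i, d in enumerate(desc_a):
        # Euclidean distance from this descriptor to every descriptor in B.
        dists = np.linalg.norm(desc_b - d, axis=1)
        nn, nn2 = np.argsort(dists)[:2]
        if dists[nn] < ratio * dists[nn2]:
            matches.append((i, nn))
    return matches

# Synthetic descriptors (SIFT descriptors are 128-dimensional):
# desc_a holds noisy copies of the first ten rows of desc_b.
rng = np.random.default_rng(0)
desc_b = rng.random((50, 128))
desc_a = desc_b[:10] + rng.normal(0.0, 0.01, (10, 128))
matches = ratio_test_match(desc_a, desc_b)
```

In a real system the descriptors would come from a SIFT implementation and the accepted matches would feed a geometric verification stage (e.g. a planar homography fit) before an object is declared present.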
In our work we show that by integrating an object detector into the single camera SLAM system, planar objects from a database can be identified in an image, localized in 3D relative to the camera, and added to a map. This allows the system to keep track of multiple objects and label them on a display as the camera moves freely. The work provides the foundation for an autonomous wearable robotic assistant that assists by identifying and tracking objects in the world and informing the user about them. From here, a wearable will be constructed, improvements to the tracking system using objects will be implemented, and audio interaction with the user will be added.
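The EKF at the heart of the single camera SLAM system alternates a prediction through the motion model with a correction from each measurement. A minimal sketch of one such cycle (not the report's implementation; the toy constant-position model and noise values below are illustrative assumptions):

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One EKF cycle: predict with motion model f (Jacobian F),
    then correct with measurement z via measurement model h (Jacobian H).
    Q and R are the process and measurement noise covariances."""
    # Predict: propagate the state and its covariance.
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update: fuse the measurement through the Kalman gain.
    y = z - h(x_pred)                    # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: 1-D constant-position model, direct observation of the state.
x = np.array([0.0]); P = np.eye(1)
F = H = np.eye(1); Q = np.eye(1) * 0.01; R = np.eye(1) * 0.1
x, P = ekf_step(x, P, np.array([1.0]), lambda s: s, F, lambda s: s, H, Q, R)
```

After one step the estimate moves most of the way toward the measurement and the covariance shrinks, reflecting the reduced uncertainty. In the SLAM case the state vector additionally stacks the map feature positions, so each update also refines the map.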

Contents
1  Introduction
2  Literature review
   2.1  Object detection
        2.1.1  Feature detectors
        2.1.2  Feature descriptors
        2.1.3  Descriptor evaluation
        2.1.4  Matching
        2.1.5  Randomised trees
        2.1.6  Applying object detection
        2.2.1  Varieties of SLAM
        2.2.2  Extended Kalman filter based SLAM
        2.2.3  Single camera SLAM
…