Eye Movement Analysis for Activity Recognition Using Electrooculography

Andreas Bulling, Student Member, IEEE, Jamie A. Ward, Hans Gellersen, and Gerhard Tröster, Senior Member, IEEE

Abstract—In this work we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data was recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals - saccades, fixations, and blinks - and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method in an eight-participant study in an office environment, using an example set of five activity classes: copying a text, reading a printed paper, taking hand-written notes, watching a video, and browsing the web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1% and recall of 70.5% over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.

Index Terms—Ubiquitous computing, feature evaluation and selection, pattern analysis, signal processing.
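As a generic illustration of the kind of eye movement event detection described in the abstract, a minimal velocity-threshold saccade detector on a sampled EOG signal might look as follows. This is a sketch under assumptions: the threshold value, sampling rate, and synthetic signal are illustrative choices, not the paper's actual detection algorithm.

```python
import numpy as np

def detect_saccades(eog, fs, vel_thresh=100.0):
    """Return (start, end) sample index pairs of candidate saccades.

    Generic velocity-threshold detection for illustration only; the paper
    uses its own EOG-specific algorithm. `vel_thresh` is a hypothetical
    threshold in signal units per second.
    """
    velocity = np.gradient(eog) * fs                 # derivative in units/s
    above = np.abs(velocity) > vel_thresh            # high-velocity samples
    # pad with False so every run of True samples has a rising and falling edge
    padded = np.concatenate(([False], above, [False])).astype(int)
    edges = np.flatnonzero(np.diff(padded))          # run boundaries
    # pair edges: start index (inclusive), end index (exclusive)
    return list(zip(edges[::2], edges[1::2]))

# synthetic EOG trace: fixation, a rapid 50-sample shift, fixation again
fs = 1000.0  # Hz
sig = np.concatenate([np.zeros(500), np.linspace(0, 10, 50), np.full(500, 10.0)])
print(detect_saccades(sig, fs))  # one interval covering the rapid shift
```

Real EOG signals would additionally need baseline-drift removal and noise filtering before thresholding, and a refined detector would merge nearby intervals and reject runs shorter than a minimum saccade duration.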
Human activity recognition has become an important application area for pattern recognition. Research in computer vision has traditionally been at the forefront of this work. The growing use of ambient and body-worn sensors has paved the way for other sensing modalities, particularly in the domain of ubiquitous computing. Important advances in activity recognition have been achieved using modalities such as body movement and posture, sound, or interactions between people.

There are, however, limitations to current sensor configurations. Accelerometers or gyroscopes, for example, are limited to sensing physical activity; they cannot easily be used to detect predominantly visual tasks, such as reading, browsing the web, or watching a video. Common ambient sensors, such as reed switches or light sensors, are limited in that they only detect basic activity events, e.g. entering or leaving a room, or switching an appliance. Beyond these limitations, activity sensing using subtle cues, such as user attention or intention, remains largely unexplored.

A rich source of information, as yet unused for activity recognition, is the movement of the eyes. The movement patterns our eyes perform as we carry out specific activities have the potential to reveal much about the activities themselves - independently of what we are looking at. This includes information on visual tasks, such as reading, on predominantly physical activities, such as driving a car, but also on cognitive processes of visual perception, such as attention or saliency determination. In a similar manner, location or a particular environment may influence our eye movements. Because we use our eyes in almost everything that we do, it is conceivable that eye movements provide useful information for activity recognition.

Developing sensors to record eye movements in daily life is still an active topic of research. Mobile settings call for highly miniaturised, low-power eye trackers with real-time processing capabilities. These requirements are increasingly addressed by commonly used video-based systems, some of which can now be worn as relatively light headgear. However, these remain expensive, with demanding video processing tasks requiring bulky...