My Imperial College Master's project supervisor was Bob Spence, someone quite well known in the HCI (Human–Computer Interaction) domain, and a former head of Electrical Engineering. The apparatus at my disposal could track a person's eye gaze, and had an API that allowed its functionality – typically used for research into interface design and gaze patterns – to be incorporated into a custom application.
The title of the project was quite open – ‘Gaze Controlled Navigation’ – and Bob encouraged me to generate ideas and explore possibilities rather than worry too much about past work or specific lines of investigation. I came up with a few ideas for using gaze to navigate around high-resolution image systems such as Google Earth, and out of that a project was born. Below is the abstract, and a link to the full project report (60+ pages) in PDF format.
Webcams attached to personal computers could soon have the capability to detect the location of a user's gaze. Such technology already exists and, although expensive, could serve as an input to aid a user in certain specialised tasks. This project therefore looks at the HCI issues related to using gaze direction in visual interface control. The specific context of this investigation is the visual search, navigation and browsing of very high resolution images, for example those obtained in medicine and astronomy.
Two novel gaze control methods were developed to allow hands-free zooming and panning of such images – ‘stare to zoom’ (STZ), which required the user to stare at a point of interest, and ‘head to zoom’ (HTZ), which required the user to tilt their head slightly, like a joystick, to zoom in the image. Following successful initial user tests of the methods, both were developed further and integrated as input methods to Google Earth, which was used to simulate an image navigation platform. A third gaze control idea, ‘dual to zoom’ (DTZ), which combined mouse and gaze input, was also implemented.
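To give a flavour of how a stare-to-zoom trigger might work, here is a minimal sketch of dwell-based activation: if the gaze stays within a small radius of an anchor point for longer than a dwell threshold, a zoom is triggered at that point. The class name, radius and dwell values are my own illustrative assumptions, not taken from the project code.

```python
import math

class StareToZoom:
    """Dwell-based zoom trigger: if gaze stays within a small radius
    of an anchor point for longer than a dwell threshold, report that
    point as a zoom target. Thresholds are illustrative assumptions."""

    def __init__(self, radius_px=40, dwell_ms=800):
        self.radius_px = radius_px   # how far gaze may wander and still count as a stare
        self.dwell_ms = dwell_ms     # how long gaze must linger before zooming
        self.anchor = None           # current candidate stare point
        self.anchor_t = None         # time the candidate was first seen

    def update(self, x, y, t_ms):
        """Feed one gaze sample; return a (x, y) zoom point, or None."""
        if (self.anchor is None or
                math.hypot(x - self.anchor[0], y - self.anchor[1]) > self.radius_px):
            # Gaze moved away: start tracking a new candidate point.
            self.anchor, self.anchor_t = (x, y), t_ms
            return None
        if t_ms - self.anchor_t >= self.dwell_ms:
            # Stared long enough: trigger a zoom, and reset the clock
            # so zooming repeats while the stare continues.
            self.anchor_t = t_ms
            return self.anchor
        return None
```

A caller would feed this with (filtered) gaze samples at the tracker's frame rate and, whenever `update` returns a point, zoom the view towards it.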
An experiment was conducted to objectively compare HTZ, STZ and DTZ with a conventional mouse input, ‘mouse to zoom’ (MTZ). As well as collecting objective data while users performed specific tasks, users were asked to give feedback about their experience using Likert-scale tests. MTZ was found to be much more accurate and somewhat faster than the gaze-controlled methods, but the results from DTZ in particular were promising, as it was the only gaze control method that subjects did not report as causing significantly more fatigue than the mouse. Important issues associated with using gaze as an input were encountered in the course of the project. These included the filtering of gaze co-ordinates and the threshold for stare detection.
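Raw gaze co-ordinates from an eye tracker are noisy, which is why filtering matters. As a rough illustration (not the project's actual filter), a simple moving average over the last few samples smooths the jitter at the cost of a little lag:

```python
from collections import deque

class GazeFilter:
    """Moving-average smoothing of noisy gaze co-ordinates.
    Window size trades jitter reduction against lag; 5 is an
    illustrative default, not a value from the project."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # oldest samples drop off automatically

    def update(self, x, y):
        """Add a raw gaze sample and return the smoothed (x, y)."""
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(s[0] for s in self.samples) / n,
                sum(s[1] for s in self.samples) / n)
```

In practice the smoothed output would be fed into whatever dwell or stare detector sits downstream, and the window size tuned against the tracker's sample rate.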
Gaze Controlled Navigation [PDF, 12.8MB, 67 pages]