Faculty of Science
Browsing Faculty of Science by Subject "Computer Engineering"
- Item: Eye Gaze Direction for Human-Computer Interaction (University of Peradeniya, 2011-11-24) Gunarathne, M; Bandara, R; Elkaduwe, D; Ragel, R

Human-Computer Interaction (HCI) refers to the interaction between users and computing systems, which occurs through the user interfaces (both hardware and software) of a computing system. HCI devices can generally be divided into two groups: input peripherals and output peripherals. The objective of this study is to describe a new technique for the former. Although new HCI techniques and devices are being introduced and deployed in computing systems as input peripherals (from different types of keyboards to mice and touch screens), they have essentially failed to address the problems encountered by a group of users who have difficulties (such as disabilities due to accidents, aging, etc.) in using their hands to handle such devices. Thus, the present research focuses on enabling the disabled to use computing systems through a new HCI technique, namely, eye-gaze direction (i.e., a steady, fixed look; or simply, 'where we are looking').

As technology evolves rapidly, all individuals should benefit, irrespective of whether they are disabled or not. Unfortunately, a disabled person who does not have the ability to move his/her hands (due to spinal cord injuries, brain injuries, multiple sclerosis, strokes, etc.) cannot use a typical computer because of the inability to move the mouse. Therefore, we propose to use eye-gaze direction as an input method to the computer. In such a setup, the eye-gaze direction of the user is detected by an eye-gaze direction monitor and used as input to the computer. This will be useful not only for disabled persons, but also as an alternative input method in environments such as gaming consoles.

We built a low-cost, non-intrusive system which can be used to move the mouse pointer through eye-gaze direction and to perform mouse actions. To achieve these goals, typical eye-gaze tracking systems use methods such as optics, electronics, mechanics, etc. In our system, we used a camera (a low-cost webcam) mounted in front of the user to capture images, which are then sent to the computer. The computer processes these images and determines the eye-gaze direction and the gestures necessary to control mouse actions such as single-clicking, double-clicking and zooming. Although a few similar systems exist, they are very expensive (as they use special-purpose eye-gaze direction trackers, such as high-priced cameras) and have limited functionality. There are obstacles when designing and implementing eye-gaze tracking systems due to trade-offs between requirements. The major challenges of such a system, ensuring smooth movement of the mouse pointer as the eye-gaze direction changes and real-time responsiveness of the system, were overcome in our project. We overcame drawbacks in the existing eye-gaze tracking systems by enhancing the accuracy of the eye-gaze direction detection and by improving the efficiency of the existing algorithms. Effective image processing techniques and Haar-like features were used to achieve our final goal, which is to help the disabled to cope with and utilise the latest technologies.
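The abstract does not include the authors' implementation, so the following is only a minimal sketch of the general approach it describes: detecting the eyes in webcam frames with OpenCV's pre-trained Haar cascades, estimating a rough gaze direction from the pupil position within the eye region, and nudging the mouse pointer accordingly. The cascade files, thresholds, step size and the pyautogui-based pointer control are illustrative assumptions, not the system reported in the study.

    # Sketch: Haar-cascade eye detection + crude gaze-based pointer movement.
    # Assumptions: OpenCV's bundled cascades and pyautogui for mouse control.
    import cv2
    import pyautogui

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    STEP = 30  # pixels to move the pointer per detected gaze offset (arbitrary)

    cap = cv2.VideoCapture(0)  # low-cost webcam mounted in front of the user
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
            face = gray[fy:fy + fh, fx:fx + fw]
            for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face):
                eye = face[ey:ey + eh, ex:ex + ew]
                # Threshold so the dark pupil stands out, then take the
                # centroid of the dark pixels as a rough pupil position.
                _, mask = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
                m = cv2.moments(mask)
                if m["m00"] == 0:
                    continue
                pupil_x = m["m10"] / m["m00"]
                # Compare the pupil's horizontal position with the eye-box
                # centre to decide left / right / straight gaze.
                offset = pupil_x - ew / 2
                if offset < -0.15 * ew:
                    pyautogui.moveRel(-STEP, 0)  # looking left
                elif offset > 0.15 * ew:
                    pyautogui.moveRel(STEP, 0)   # looking right
                break  # one eye is enough for this sketch
            break

        cv2.imshow("gaze", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

A real system along the lines described in the abstract would add calibration, smoothing of the pointer trajectory, vertical gaze estimation, and gesture recognition (e.g., blinks or dwell time) to trigger single-click, double-click and zoom actions.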