
Guidance of a wheelchair using electrooculography
RAFAEL BAREA, LUCIANO BOQUETE, ELENA LÓPEZ, MANUEL MAZO. Electronics Department, University of Alcala. Alcalá de Henares, Madrid, Spain.

Abstract:- This paper presents a new method to control and guide mobile robots. In this case, electrooculography (EOG) techniques are used to send the different commands, so that control is performed by means of the ocular position (eye displacement within its orbit). This control technique can be useful in multiple applications, but in this work it is used to guide an autonomous robot (a wheelchair) as a system to help people with severe disabilities. The system consists of a standard electric wheelchair with an on-board computer, sensors and a graphical user interface running on a computer.

Keywords: Electrooculographic potential (EOG), control system, handicapped people, wheelchair.

1 Introduction
Assistive robotics can improve the quality of life of disabled people. Nowadays there are many systems to help control and guide autonomous mobile robots, all of which allow their users to travel more efficiently and with greater ease [1]. These systems cover different areas such as mobile robot location and positioning (odometric systems, GPS, etc.), generation and tracking of trajectories, path planning in the environment using ultrasonic or infrared sensors [2], artificial vision techniques, etc. They can work autonomously or can be used as reference systems by their users. In recent years, however, applications aimed at helping people with severe disabilities have increased, and the traditional systems are no longer adequate. Among these new systems we find: videooculography (VOG) [3][4] and infrared oculography (IROG) systems, based on detecting the eye position with a camera; several techniques based on voice recognition, for detecting basic commands to control instruments or robots; and the joystick [1] (sometimes a tactile screen), the most popular technique used by people with limited upper-body mobility to control different applications, although it requires fine control that the person may have difficulty accomplishing. All these techniques can be applied to different people according to their degree of disability, always using the technique or techniques that are most efficient for each person.

This paper reports initial work in the development of a robotic wheelchair system based on electrooculography [5]. Our system allows the user to tell the robot where to move in gross terms; the robot then carries out the navigational task using common-sense constraints such as avoiding collisions. The wheelchair system is intended to be a general-purpose navigational assistant in environments with accessible features, such as ramps and doorways of sufficient width to allow a wheelchair to pass. This work builds on previous research in robot path planning and mobile robotics; however, a robotic wheelchair must interact with its user, making the robotic system semi-autonomous rather than completely autonomous. The paper is divided into the following sections: section 2 describes the electrooculography technique used to register eye movement and eye gaze, section 3 presents the control system used, and section 4 describes the wheelchair. Section 5 shows some results, and section 6 puts forward the main conclusions and lays down the main lines of future work.

2 Electrooculographic potential (EOG)
There are several methods to sense eye movement. Most of them involve the use of cameras or vision systems to track some feature of the eye, using reverse geometry to determine where the user is looking. Several systems use infrared illumination and an infrared-sensitive camera to track eye movements. In this work, the goal is to sense the electrooculographic potential (EOG), sometimes also known as the electronystagmographic potential (ENG). Our system, a discrete electrooculographic control system (DECS), is based on recording the polarization potential or corneal-retinal potential (CRP) [6], commonly known as the electrooculogram (EOG). The EOG ranges from 0.05 to 3.5 mV in humans and is linearly proportional to eye displacement. The CRP is produced by hyperpolarizations and depolarizations of the nervous cells in the retina. The human eye is an electrical dipole with a negative pole at the fundus and a positive pole at the cornea (Figure 1). Currently, the major clinical use of the EOG is in diagnosing vestibular and balance problems.

Fig 2.- Electrode placement.

The electrodes placed around the eyes measure the electrooculographic potential (EOG), which is a function of the position of the eye relative to the head. Basically, the difference between the voltages of the electrodes above and below the eye (B, C) indicates the vertical position of the eye relative to the head, and the difference between the voltages of the electrodes to the left and right of the eyes (D, E) indicates its horizontal position. The EOG signal changes approximately 20 microvolts for each degree of eye movement. In our system, the signals are sampled 10 times per second. The electrodes used are reusable Ag-AgCl biopotential skin electrodes or grass electrodes, with gel as electrolyte. Recording the EOG signal presents several problems. Firstly, the signal is seldom deterministic, even for the same person in different experiments. The EOG signal results from a number of factors, including eyeball rotation and movement, eyelid movement, different sources of artifact such as the EEG, electrode placement, head movements, the influence of luminance, etc. For these reasons, it is necessary to eliminate the shifting resting potential (mean value), because this value changes. To avoid this problem, an AC differential amplifier is used, with a high-pass filter with cutoff at 0.05 Hz and a relatively long time constant. The amplifier used has a programmable gain of 500, 1000, 2000 or 5000. Figure 3 shows the influence of luminance on the shifting resting potential. The experiment consisted of observing a point while the luminance was increased for 2 minutes; the result shows that the EOG signal increases as the luminance increases.
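As an illustration of the processing just described, the following sketch (not from the original paper; only the 10 Hz sampling rate, the 0.05 Hz high-pass cutoff and the ~20 µV/degree sensitivity are taken from the text, and all names are our own) converts the two differential electrode voltages into approximate gaze angles after removing the shifting resting potential with a first-order high-pass filter:

```python
import numpy as np

FS = 10.0          # sampling rate quoted in the text (10 samples per second)
UV_PER_DEG = 20.0  # approximate EOG sensitivity: ~20 microvolts per degree
F_CUT = 0.05       # high-pass cutoff (Hz) that removes the resting potential

def highpass(x, fs=FS, fc=F_CUT):
    """First-order high-pass filter; suppresses the shifting resting potential."""
    alpha = 1.0 / (1.0 + 2.0 * np.pi * fc / fs)
    y = np.zeros(len(x))
    for n in range(1, len(x)):
        y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])
    return y

def eog_to_angles(v_above, v_below, v_left, v_right):
    """Convert electrode voltages (microvolts) into gaze angles (degrees).

    Vertical position:   difference of the electrodes above/below the eye (B, C).
    Horizontal position: difference of the electrodes left/right of the eyes (D, E).
    """
    vert = highpass(np.asarray(v_above, float) - np.asarray(v_below, float))
    horiz = highpass(np.asarray(v_left, float) - np.asarray(v_right, float))
    return vert / UV_PER_DEG, horiz / UV_PER_DEG
```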

Fig 1.- Dipole model of the eye.

This system may be used to augment communication and/or control. The analog signals from the oculographic measurements are turned into signals suitable for control purposes. The EOG is derived by placing two electrodes on the outer side of the eyes to detect horizontal movement, and another pair above and below the right eye to detect vertical movement. A reference electrode is placed on the forehead. Figure 2 shows the electrode placement.

3 Visual control system using electrooculography

Fig 3.- Influence of the luminance on the shifting resting potential.

Figure 4 shows the result obtained when the shifting resting potential is eliminated with the AC amplifier while the luminance conditions change.

The objective of this control system is to guide an autonomous mobile robot using the position of the eye within its orbit, obtained from the EOG signal (Figure 7). In this case, the autonomous vehicle is a wheelchair for disabled people.

Fig 7.- Wheelchair control system (block diagram: EOG hardware interface via a LAB-PC target board, hardware interface to the PLCTA / Neuron Chip, linear and angular speed commands (V, Ω), and wheel encoders).

Fig 4.- Influence of the luminance on the shifting resting potential using an AC amplifier.

On the other hand, Figures 5 and 6 show the EOG signal for eye displacements between -40º and 40º, in increments of 10º over 5-second intervals, without and with the AC amplifier.

Multiple options can be used to control the robot's movements: interpretation of different commands generated by means of eye movements, generation of different trajectories as a function of gaze points, etc. We use the first option, because it allows us to generate a simple code for controlling the wheelchair from the eye position.

Fig 5.- Changes in EOG signal due to eye movements with DC amplifier.

Fig 6.- Changes in EOG signal due to eye movements with AC amplifier.

Fig 8.- Gaze control system (screen regions TOP, BOTTOM, LEFT and RIGHT, bounded by the top, bottom, left and right thresholds).

In this case, the vertical position is used for controlling the linear speed of the wheelchair. The control rules are:

If vertical position > top threshold → V++
If vertical position < bottom threshold → V--

And the horizontal position is used for controlling the angular speed:

If horizontal position > right threshold → W = WPOS
If horizontal position < left threshold → W = WNEG

Several alarm and stop commands are needed for dangerous situations. These codes can be generated by means of blinks and the alpha waves that appear in the EEG when the eyelids are closed. The robotic wheelchair system must be able to navigate indoor and outdoor environments, and should switch automatically between navigation modes for these environments. These navigation modes can thus be applied according to the user's degree of disability, always using the techniques that are most efficient for each person. Different support systems are needed to avoid collisions, and the robotic system must be able to switch automatically to autonomous control: for example, if the user loses control and the system becomes unstable, the wheelchair should take over the control of the system. This work is part of a general-purpose navigational assistant for environments with accessible features that allow a wheelchair to pass: the SIAMO project [1]. Figure 9 shows a diagram of the different subsystems that can be incorporated in the robotic wheelchair system.
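A minimal sketch of these control rules follows (the paper specifies only the rules themselves; the threshold values, speed steps and function names below are hypothetical):

```python
# Hypothetical values; the paper does not state the thresholds or speed steps.
TOP_TH, BOTTOM_TH = 15.0, -15.0   # vertical thresholds (degrees)
RIGHT_TH, LEFT_TH = 15.0, -15.0   # horizontal thresholds (degrees)
V_STEP = 0.05                     # linear speed increment (m/s)
W_POS, W_NEG = 0.3, -0.3          # angular speed commands (rad/s)

def gaze_command(vert_deg, horiz_deg, v, eyes_closed):
    """Map gaze angles to wheelchair commands following the rules above."""
    if eyes_closed:             # blink / EEG alpha waves: alarm, stop the chair
        return 0.0, 0.0
    if vert_deg > TOP_TH:       # looking up: V++
        v += V_STEP
    elif vert_deg < BOTTOM_TH:  # looking down: V--
        v -= V_STEP
    w = 0.0
    if horiz_deg > RIGHT_TH:    # looking right: W = WPOS
        w = W_POS
    elif horiz_deg < LEFT_TH:   # looking left: W = WNEG
        w = W_NEG
    return v, w
```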

4 Hardware and Software
A software program has been developed for controlling and calibrating the eye position. This software is based on LabWindows and uses the SILMON data acquisition system. Communication between the PC and the wheelchair is carried out through the PLCTA and a LonWorks bus (based on a Neuron Chip) [7]. Figure 10 shows the main window of the program, where the different EOG signals (vertical and horizontal), recorded every 0.1 s, can be seen.

Fig 10.- Main window.

Different commands can be selected (figure 10): calibrate, acquire, stop, etc. Once the eye position has been calculated, the eye gaze is obtained and represented in this window in degrees (figure 11).
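The calibration step could, for instance, be a linear least-squares fit between the recorded EOG values and known target angles; the following is a sketch under our own assumptions, since the paper does not detail the calibration algorithm:

```python
import numpy as np

def calibrate(eog_uv, target_deg):
    """Fit gain (microvolts/degree) and offset from a calibration run in which
    the user fixates targets at known angles."""
    A = np.vstack([np.asarray(target_deg, float), np.ones(len(target_deg))]).T
    gain, offset = np.linalg.lstsq(A, np.asarray(eog_uv, float), rcond=None)[0]
    return gain, offset

def to_degrees(eog_uv, gain, offset):
    """Convert a raw EOG value into a gaze angle with the fitted parameters."""
    return (eog_uv - offset) / gain

# Example targets from -40 to 40 degrees in 10-degree steps, as in Figs 5 and 6.
targets = np.arange(-40, 41, 10)
```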

Fig 9.- SIAMO project scheme (user-machine interface and central processing unit with path generation and dead reckoning; navigation and sensor integration with infrared and ultrasonic sensors; safety and environment detection; low-level control with power and motion controllers; input devices: linear inputs, switches, breath & voice, eye movements; output devices: LEDs, LCD displays, voice synthesizer; interconnected by LonWorks buses, a parallel bus and the encoder lines).

Fig 11.- Eye gaze.

The EOG signals are processed in the computer, which sends the control commands to the wheelchair through the PLCTA. The Neuron Chip receives each command and generates the linear speed command for each wheel independently.
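The paper states that the Neuron Chip generates the linear speed command for each wheel independently; for a differential-drive wheelchair this is the standard kinematic split of the (V, Ω) command, sketched below (the track width value is hypothetical):

```python
TRACK_WIDTH = 0.56  # distance between the drive wheels in metres (assumed)

def wheel_speeds(v, w, track=TRACK_WIDTH):
    """Split linear speed v (m/s) and angular speed w (rad/s) into independent
    left/right wheel speeds for a differential-drive vehicle."""
    v_left = v - w * track / 2.0
    v_right = v + w * track / 2.0
    return v_left, v_right
```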

5 Results
In this section, several results of guiding the wheelchair are shown.

Figure 14 shows the wheelchair used in the SIAMO project.

Fig 14.- Wheelchair used in the SIAMO project.

Fig 12.- Wheelchair guidance I.

Fig 13.- Wheelchair guidance II.

In figures 12 and 13, the red line represents the 3-spline curve that the wheelchair must follow. This trajectory is obtained using a trajectory spline generator developed within the SIAMO project. The blue line represents the trajectory obtained when the wheelchair is guided using EOG. It can be appreciated that the desired trajectory is followed with a small lateral error. We have not yet tried this control with persons with disabilities, but we consider that the control commands are not difficult to learn; using this system is an acquired skill, and some studies show that disabled persons usually require about 15 minutes to learn to use this kind of system.

6 Conclusions
This research project is aimed at developing a usable, low-cost assistive robotic wheelchair system for disabled people. In this work we presented a system that can be used as a means of control, allowing handicapped people, especially those with only eye-motor coordination, to live more independent lives. Eye movements require minimal effort and allow direct selection techniques, which improves the response time and the rate of information flow. Much previous wheelchair robotics research is restricted to a particular location, and in many areas of robotics environmental assumptions can be made that simplify the navigation problem. However, a person using a wheelchair and the EOG technique should not be limited by the device intended to assist them, provided that the environment has accessible features. Many applications can be developed using EOG, because this technique gives the user a degree of freedom in the environment. If the eye gaze is known, it is possible to develop different user interfaces to control different tasks: for example, spell-and-speak software programs allow users to write a letter or a message, after which a control system can interpret the message and generate different commands; a similar code could be generated for deaf people, etc.

Acknowledgments: The authors would like to express their gratitude to the Comisión Interministerial de Ciencia y Tecnología (CICYT) for their support through project TER96-1957-C03-01.

References:
[1] SIAMO Project (CICYT). Electronics Department, University of Alcala, Madrid, Spain.
[2] A. Elfes. Sonar-Based Real-World Mapping and Navigation. IEEE Journal of Robotics and Automation, Vol. RA-3, No. 3, June 1987.
[3] J. A. Lahoud and D. Cleveland. The Eyegaze Eyetracking System. LC Technologies, Inc. 4th Annual IEEE Dual-Use Technologies and Applications Conference, SUNY Institute of Technology at Utica/Rome, New York.
[4] L. M. Bergasa et al. Face Tracking Using an Adaptive Skin Colour Model. Third International ICCS Symposia on Intelligent Automation (IIA'99) and Soft Computing (SOCO'99), Genova, Italy, June 1999.
[5] J. Gips, P. DiMattia, F. X. Curran and P. Olivieri. EagleEyes Project. Computer Science Department, Boston College, Chestnut Hill, Mass., USA.
[6] M. C. Nicolau, J. Burcet, R. V. Rial. Manual de técnicas de Electrofisiología clínica. University of the Balearic Islands.
[7] LONWORKS Engineering Bulletin Contents, January 1995.
