Tuesday, 3 December 2013
Multi sensor Data Fusion for Physical Activity Assessment
Common communication interfaces for computers and similar equipment require dexterity and fine motor control from the user. However, many people with disabilities, such as those suffering from Amyotrophic Lateral Sclerosis (ALS) or upper spinal cord injury (SCI), can lose their communication and limb-control abilities, becoming locked in their own bodies, with a low quality of life and with frustration, anxiety, and depression. Several groups have worked on methods to assist people with disabilities. This work presents a multi-modal interface for communication by people with disabilities. The interface is installed on board a robotic wheelchair and offers the flexibility to choose among different communication modalities according to the user's level of disability. Users can operate the interface through eye blinks, eye movements, head movements, by blowing or sucking on a straw, or through brain signals. The interface is easy to use and has a flexible graphical user interface running on a personal digital assistant or tablet. Several experiments were carried out with healthy people and with people with disabilities, and the results validate the developed interface as an assistive tool that allows communication by people with distinct levels of disability.
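The key design idea, letting users with different abilities drive the same interface through whichever modality they can control, can be illustrated with a small sketch. The code below is not from the paper; the event names, class names, and command labels are all hypothetical, and it only shows the general pattern of binding gestures from several modalities to a shared set of interface commands:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Hypothetical event record for any input modality mentioned in the abstract,
# e.g. "blink", "eye_move", "head_move", "sip_puff", or "bci".
@dataclass(frozen=True)
class InputEvent:
    modality: str   # which sensor produced the event
    value: str      # gesture label, e.g. "double_blink", "puff", "left"

class MultiModalInterface:
    """Maps gestures from whichever modality the user selected
    onto a common set of interface commands."""

    def __init__(self) -> None:
        self._bindings: Dict[Tuple[str, str], str] = {}

    def bind(self, modality: str, value: str, command: str) -> None:
        """Associate one gesture of one modality with a command."""
        self._bindings[(modality, value)] = command

    def handle(self, event: InputEvent) -> str:
        # Unbound gestures map to "ignore" rather than raising,
        # so sensor noise does not trigger unintended commands.
        return self._bindings.get((event.modality, event.value), "ignore")

# Example configuration: two modalities drive the same "select" command,
# so a user can switch modality without relearning the interface.
ui = MultiModalInterface()
ui.bind("sip_puff", "puff", "select")
ui.bind("sip_puff", "sip", "next_option")
ui.bind("blink", "double_blink", "select")

print(ui.handle(InputEvent("sip_puff", "puff")))   # prints "select"
print(ui.handle(InputEvent("eye_move", "left")))   # prints "ignore"
```

The point of the table-driven binding is that the graphical user interface only ever sees abstract commands ("select", "next_option"), so adding or swapping a modality does not change the rest of the system.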