Goal

Within the AICASys project, a gaze-based control system is being developed for controlling a conventional electric powered wheelchair on several behavioral levels. While earlier work focused on estimating the driver's intention at the motion and position levels, the current research project AICASys focuses on the task execution level.

Description

The system is integrated into the smart-home system and uses additional context information from the environment. By analyzing the gaze behavior directed at natural objects in the environment, the driver can control actions of the wheelchair and the smart-home environment in a seamless way, while being assisted by semi-autonomous functions. Figure 1 shows the scenario of gaze-based door control and driving through the door. The user wears enhanced gaze-tracking glasses that augment the real scene perceptible to the user with virtual controls. The control system enables the user to interact with the wheelchair and with devices in the environment, such as door openers, roller shutters, and lights, using natural gaze behavior.
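The gaze-based interaction described above can be sketched, in a much simplified form, as a dwell-time trigger: when the user's gaze rests long enough on a region associated with a known object (a door, a light switch), the corresponding action is issued. All names below (object regions, dwell threshold, action strings) are illustrative assumptions, not part of the actual AICASys implementation.

```python
from dataclasses import dataclass

# Illustrative assumption: 1 s dwell time before an action is triggered.
DWELL_THRESHOLD_S = 1.0


@dataclass
class ObjectRegion:
    """A known environment object, given as a bounding box in scene coordinates."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float
    action: str  # hypothetical action identifier, e.g. "open_door"

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


class DwellSelector:
    """Issues an object's action once the gaze dwells on it long enough."""

    def __init__(self, regions, threshold=DWELL_THRESHOLD_S):
        self.regions = regions
        self.threshold = threshold
        self._current = None   # region currently fixated, if any
        self._dwell = 0.0      # accumulated fixation time in seconds

    def update(self, gaze_x, gaze_y, dt):
        """Feed one gaze sample (position plus elapsed time dt).

        Returns the action string of the fixated region once the dwell
        threshold is reached, otherwise None.
        """
        hit = next((r for r in self.regions if r.contains(gaze_x, gaze_y)), None)
        if hit is not self._current:
            # Gaze moved to a different target (or off all targets): restart timer.
            self._current, self._dwell = hit, 0.0
            return None
        if hit is None:
            return None
        self._dwell += dt
        if self._dwell >= self.threshold:
            self._dwell = 0.0  # re-arm so the action is not repeated every frame
            return hit.action
        return None
```

A caller would feed gaze samples at the tracker's frame rate, e.g. `DwellSelector([ObjectRegion("door", 100, 100, 200, 300, "open_door")])` updated every 33 ms, and forward any returned action to the wheelchair or smart-home controller.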

The project is funded by the BMBF under grant 16SV7181K within the program “IKT 2020 – Forschung für Innovationen”.

The project partners are: Automation Laboratory, ZITI, Heidelberg University (Coordinator); SensoMotoric Instruments GmbH, Teltow; German Research Center for Artificial Intelligence (DFKI), Kaiserslautern; FZI Research Center for Information Technology, Karlsruhe; CIBEK GmbH, Limburgerhof.

Figure 1: Assisted Wheelchair and Smart-Home Environment.
