IEEE International Workshop on Machine Learning for Signal Processing (MLSP), Beijing, China, 18-21 September 2011
In this paper we propose an application that combines two research disciplines: object detection and brain-computer interfacing. It is particularly useful for patients suffering from a severe motor impairment that prevents them from interacting with their surrounding environment. The application displays an image, e.g., of the patient's room, on a computer screen and searches it for instances of certain objects. When these are found, a flashing dot is overlaid on each of them, flickering at a fixed frequency that differs per object. Meanwhile, brain activity (EEG) is recorded. The patient can then select an object by looking at the corresponding flashing dot: the application processes the EEG readings and identifies the frequency embedded in the signal (SSVEP decoding), from which it infers the object the subject was looking at. In this way a patient can (re)gain interaction with his or her environment.
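The SSVEP decoding step described above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a single EEG channel and scores each candidate flicker frequency by the spectral power at that frequency and its harmonics, selecting the strongest. The function and variable names are hypothetical, and the EEG here is synthetic.

```python
import numpy as np

def decode_ssvep(eeg, fs, stim_freqs, harmonics=2):
    """Return the stimulus frequency with the most spectral power in the EEG.

    eeg: 1-D signal (single channel, e.g. an occipital electrode)
    fs: sampling rate in Hz
    stim_freqs: candidate flicker frequencies, one per object on screen
    harmonics: how many harmonics of each frequency to sum
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = []
    for f in stim_freqs:
        score = 0.0
        for h in range(1, harmonics + 1):
            # power in the FFT bin nearest to the h-th harmonic of f
            score += spectrum[np.argmin(np.abs(freqs - h * f))]
        scores.append(score)
    return stim_freqs[int(np.argmax(scores))]

# Synthetic check: 2 s of "EEG" containing a 12 Hz SSVEP response plus noise
fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
print(decode_ssvep(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

In a real system the decision would be made on windows of multi-channel EEG and with more robust detectors (e.g. canonical correlation analysis), but the principle is the same: the flicker frequency of the attended dot dominates the recorded spectrum.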