CHS: Small: A Hybrid Brain-Computer Interface for Behaviorally Non-Responsive Patients

  • Nam, Chang C.S. (PI)
  • Krusienski, Dean D. (CoPI)

Project Details

Description

Brain-computer interfaces (BCIs) have been explored for several years in an effort to provide communication for 'locked-in' users who have the desire and mental capacity to communicate but are unable to speak, type, or use conventional assistive technologies due to severe motor disabilities. Considerable work has gone into making this initially crude technology more practical, usable, accurate, and flexible (e.g., by improving speed of performance and providing virtual reality feedback and/or advanced device control). In this project the PI and his team turn their attention to a group with even greater need: patients who have been misdiagnosed as vegetative or minimally conscious, and thus as lacking the mental ability to form messages or respond to questions. These individuals are not only unable to move but also unable to see, and are at risk of being euthanized based on the mistaken assumption that they are effectively 'brain dead'. Yet European colleagues, using methods and equipment comparable to those available in American hospitals, have shown that 17-42% of such patients were in fact able to use a BCI to respond to questions. The PI is concerned that severely injured veterans and others may sometimes be misdiagnosed, and might be able to communicate with friends and loved ones if only some technology could more effectively assess their brain activity. The PI's goal in this project is to extend current BCI technologies to focus on assessing consciousness in this vulnerable patient population and to provide, where possible, the ability to communicate.

The PI's approach is to adapt and extend conventional BCI protocols and feedback environments to work with people who cannot see and must instead rely on other modalities of stimulation. The work will involve three thrusts. First, the PI and his team will improve methods to identify brain responses based only on tactile and auditory stimuli, by determining the best tactile stimulation frequency for each subject. Second, they will use 'hybrid' BCIs combining P300s and steady-state somatosensory evoked potentials (SSSEPs) to elicit two different kinds of EEG signals that together could improve accuracy. Finally, they will develop a new six-choice BCI system tailored for these users; at present the best BCIs for these patient groups allow just two or three choices, whereas a six-choice system could lead to faster communication and broader control options. Across all three thrusts, the team will also explore signal processing methods to improve accuracy. Human subjects experiments will be conducted with three groups: healthy blindfolded users, healthy blind persons, and patients who have been labeled as nonresponsive. Project outcomes, which will include useful non-visual BCIs that have been tested with blind persons and new methods for combining different EEG signals to improve performance, will be broadly disseminated both to the scientific community and to those involved with nonresponsive patient care at all levels.
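
As a rough illustration of how a hybrid P300/SSSEP pipeline might fuse the two signal types, the sketch below extracts time-domain ERP features (for the P300) and band power at an assumed tactile stimulation frequency (for the SSSEP), concatenates them, and classifies with linear discriminant analysis. The sampling rate, stimulation frequency, channel count, window lengths, and toy data are all assumptions made for illustration; this is not the project's actual pipeline.

```python
# Hypothetical hybrid P300 + SSSEP feature fusion (illustrative only).
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 256            # assumed EEG sampling rate (Hz)
SSSEP_FREQ = 23.0   # assumed tactile stimulation frequency (Hz)

def p300_features(epochs):
    """Average-downsample the 0-600 ms post-stimulus window (time-domain ERP features)."""
    # epochs: (n_trials, n_channels, n_samples)
    window = epochs[:, :, : int(0.6 * FS)]
    segments = np.array_split(window, 12, axis=2)
    return np.concatenate([s.mean(axis=2) for s in segments], axis=1)

def sssep_features(epochs):
    """Band power around the stimulation frequency (frequency-domain SSSEP features)."""
    freqs, psd = welch(epochs, fs=FS, nperseg=FS, axis=2)
    band = (freqs >= SSSEP_FREQ - 1) & (freqs <= SSSEP_FREQ + 1)
    return psd[:, :, band].mean(axis=2)

def hybrid_features(epochs):
    # Concatenate the two feature sets so a single classifier sees both signal types.
    return np.hstack([p300_features(epochs), sssep_features(epochs)])

# Toy data standing in for preprocessed epochs: 200 trials, 8 channels, 1 s each.
rng = np.random.default_rng(0)
X_epochs = rng.standard_normal((200, 8, FS))
y = rng.integers(0, 2, size=200)   # target vs. non-target stimulus labels

scores = cross_val_score(LinearDiscriminantAnalysis(), hybrid_features(X_epochs), y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

In practice the feature extraction and classifier would be tuned per subject, consistent with the project's plan to determine the best tactile stimulation frequency for each individual.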

Status: Finished
Effective start/end date: 1/8/14 → 31/7/19

Funding

  • National Science Foundation: US$499,894.00

ASJC Scopus Subject Areas

  • Speech and Hearing
  • Computer Science (all)
