Dorothea Wendt

Scientist, PostDoc

mail@eriksholm.com

If a hearing aid can decode where its user wishes to attend, it becomes possible to control which sounds are amplified simply by attending to them. The COCOHA project examined how to decode the user's intent.

Cognitive Control of a Hearing Aid (COCOHA) was an EU Horizon 2020 project running from 1 January 2015 to 31 December 2018. The project consortium consisted of École Normale Supérieure in Paris, France, the Technical University of Denmark (DTU) in Lyngby, Denmark, the University Hospital in Zürich, Switzerland, University College London, UK, and Eriksholm Research Centre.

The COCOHA project aimed to create a basis for hearing aids that can be controlled by the user's intent.

The core idea of the project was based on classification of decoded auditory attention, as shown by James O'Sullivan et al. in 2015. Additionally, a fall-back solution was formulated in which the hearing aid would select the audio stream corresponding to visual attention. Hence, two final demonstrators were defined:

Demonstrator 1: A real-time demonstrator in which auditory attention is decoded from scalp-EEG or ear-EEG brain signals to determine who the attended speaker is, so that this voice can be amplified (a sketch of the classification step follows the demonstrator descriptions below).

Demonstrator 2: A real-time demonstrator in which eye gaze from ear-EOG sensors is combined with head orientation from motion trackers to estimate which person the subject is looking at, so that this voice can be amplified.
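
To make the idea behind Demonstrator 1 concrete, here is a minimal sketch of the classification step used in stimulus-reconstruction approaches such as O'Sullivan et al. (2015): an envelope reconstructed from EEG is correlated with the envelopes of the competing speech streams, and the best-matching stream is taken as the attended one. The code uses Python with synthetic signals; the function names, sampling rate, and segment length are illustrative assumptions, not the COCOHA implementation.

import numpy as np

def classify_attention(reconstructed_env, env_a, env_b):
    # The attended speaker is the one whose speech envelope
    # correlates best with the envelope reconstructed from EEG.
    r_a = np.corrcoef(reconstructed_env, env_a)[0, 1]
    r_b = np.corrcoef(reconstructed_env, env_b)[0, 1]
    return "A" if r_a > r_b else "B"

# Synthetic 5-second envelopes at 64 Hz stand in for two talkers
fs = 64
env_a = np.abs(np.random.randn(5 * fs))
env_b = np.abs(np.random.randn(5 * fs))
# Pretend the EEG decoder output resembles talker A plus noise
reconstructed = env_a + 0.5 * np.random.randn(5 * fs)
print(classify_attention(reconstructed, env_a, env_b))  # prints "A" almost always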



See how a person with hearing impairment experiences the system with eye-gaze here.

 

The project was divided into six scientific work packages.

Work package 1 was mainly handled by DTU, and its main result was a MATLAB-based tool for simulating speech in acoustic rooms with different amounts of reverberation and background noise. This is an important tool for hearing aid evaluation and controlled research setups.
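
As an illustration of what such a simulation tool does, the sketch below convolves dry speech with a room impulse response and adds background noise at a chosen signal-to-noise ratio. It is a simplified Python stand-in; the exponentially decaying impulse response and all parameters are assumptions for illustration, not the DTU tool itself.

import numpy as np
from scipy.signal import fftconvolve

def simulate_room(speech, rir, noise, snr_db):
    # Reverberant speech: dry speech convolved with the room impulse response
    reverberant = fftconvolve(speech, rir)[: speech.size]
    noise = noise[: speech.size]
    # Scale the noise to hit the requested signal-to-noise ratio
    gain = np.sqrt(np.mean(reverberant ** 2) /
                   (np.mean(noise ** 2) * 10 ** (snr_db / 10)))
    return reverberant + gain * noise

fs = 16000
speech = np.random.randn(fs)                       # stand-in for 1 s of dry speech
rir = 0.1 * np.random.randn(int(0.3 * fs))         # crude 300 ms "reverb tail" ...
rir *= np.exp(-np.arange(rir.size) / (0.05 * fs))  # ... with exponential decay
rir[0] = 1.0                                       # direct path
noisy = simulate_room(speech, rir, np.random.randn(fs), snr_db=5)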

Work package 2 was mainly handled by École Normale Supérieure, and its main result was a reduction of the attention-decoding error from 20 percent to 3 percent for 5-second speech segments.
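
The decoder behind such numbers is typically a linear backward model estimated with ridge regression, mapping time-lagged EEG channels to the attended speech envelope. The sketch below shows that estimation step in Python on synthetic data; the channel count, lag range, and regularization strength are illustrative assumptions rather than the project's actual settings.

import numpy as np

def build_lagged(eeg, n_lags):
    # Stack time-lagged copies of every EEG channel: (samples, channels * lags)
    n, c = eeg.shape
    X = np.zeros((n, c * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * c:(lag + 1) * c] = eeg[: n - lag]
    return X

def train_decoder(eeg, envelope, n_lags=16, lam=1e3):
    # Ridge regression: w = (X'X + lam * I)^-1 X'y
    X = build_lagged(eeg, n_lags)
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ envelope)

fs, n_ch = 64, 32
eeg = np.random.randn(60 * fs, n_ch)          # 60 s of synthetic EEG
envelope = np.abs(np.random.randn(60 * fs))   # synthetic attended envelope
w = train_decoder(eeg, envelope)
reconstructed = build_lagged(eeg, 16) @ w     # input to the classification step above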

Work package 3 was mainly handled by University College London, and work package 4 was mainly handled by DTU. Work package 3 focused on brain responses from normal-hearing subjects; its main results concerned involuntary shifts of attention (distraction), the effect of cognitive load on auditory processing, and gaze location.

Work package 4 focused on brain responses from hearing-impaired subjects, and, interestingly, it was demonstrated that the steering worked for both normal-hearing and hearing-impaired listeners in a closed-loop system.

Work package 5 was mainly handled by University Hospital Zürich, and its main result was a multi-microphone platform combined with general acoustic scene analysis, which allows different speakers to be localized and streamed separately.
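
One core building block of such a platform is estimating where a talker is relative to the microphone array. The sketch below uses the generic GCC-PHAT method to estimate the time difference of arrival between two microphones, from which a direction can be derived; this is a textbook technique written in Python, not the Zürich platform's actual code.

import numpy as np

def gcc_phat_tdoa(x1, x2, fs):
    # Returns the delay of x2 relative to x1, in seconds
    n = x1.size + x2.size
    X1, X2 = np.fft.rfft(x1, n), np.fft.rfft(x2, n)
    cross = X2 * np.conj(X1)
    cross /= np.abs(cross) + 1e-12                       # PHAT: keep phase only
    cc = np.fft.irfft(cross, n)
    cc = np.concatenate((cc[-(n // 2):], cc[: n // 2]))  # center zero lag
    return (np.argmax(np.abs(cc)) - n // 2) / fs

fs = 16000
src = np.random.randn(fs)
mic1 = src
mic2 = np.concatenate((np.zeros(8), src[:-8]))    # mic2 hears the source 8 samples later
print(round(gcc_phat_tdoa(mic1, mic2, fs) * fs))  # -> 8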

Work package 6 was mainly handled by Eriksholm Research Centre and consisted of integrating findings from all other work packages into a real-time prototype hearing aid, consisting of behind-the-ear shells with microphones, dry electrodes (without amplifiers), and a processing unit. This work package also developed the eye-gaze steering algorithm needed for Demonstrator 2.
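
As an illustration of what a gaze-steering algorithm has to do, the sketch below combines a head-yaw estimate from a motion tracker with a horizontal eye-in-head angle from ear-EOG into a world-referenced gaze direction, then selects and amplifies the nearest speaker. The angles, gains, and function names are our assumptions for illustration, not necessarily how the COCOHA algorithm works.

import numpy as np

def select_speaker(head_yaw_deg, eog_eye_deg, speaker_azimuths_deg):
    # World-referenced gaze direction = head orientation + eye-in-head angle
    gaze = head_yaw_deg + eog_eye_deg
    # Wrap angular differences into [-180, 180) before comparing
    diffs = (np.asarray(speaker_azimuths_deg) - gaze + 180) % 360 - 180
    return int(np.argmin(np.abs(diffs)))

speakers = [-60, 0, 60]              # azimuths of three talkers, in degrees
idx = select_speaker(head_yaw_deg=45, eog_eye_deg=10,
                     speaker_azimuths_deg=speakers)
gains = np.full(len(speakers), 0.2)  # attenuate unattended streams
gains[idx] = 1.0                     # amplify the attended stream (here: 60 degrees)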



At the end of the project, knowledge from all work packages was integrated into the real-time prototype and tested on hearing-impaired subjects. Although the technology still needs to mature, cognitive control of a hearing aid appears to be a valuable approach for improving the benefit of future hearing aids to their users.

Read more about COCOHA on the project website: https://cocoha.org/

