A critical knowledge gap in brain-computer interfaces (BCI) is the lack of an intuitive interface between humans and AI that can convey human intention and AI decision logic.
Natural brain-computer interface (nBCI)
Breaking the barrier: towards a more intuitive and seamless brain-computer interface
BCI enables the brain to interact directly with a computer or machine. Current non-invasive EEG-based BCI technology relies on artificial, designed stimuli (e.g. looking at flickering targets or flashing photos) or trained thoughts (e.g. imagining a left-hand movement or blinking twice). It cannot read the brain directly to sense what a person is naturally seeing and thinking. Electrocorticography signals, recorded by invasive intracranial electrodes, have shown some ability to ‘read’ speech and ‘see’ the object of focus in the mind’s eye; however, the invasiveness of this approach makes it unsuitable for many applications.
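To make the contrast concrete, the sketch below illustrates the conventional ‘designed stimulus’ approach in its simplest form: steady-state visual evoked potentials (SSVEP), where the user stares at one of a few targets flickering at known frequencies and the decoder merely checks which frequency dominates the EEG power spectrum. The sampling rate, flicker frequencies and synthetic test signal are illustrative assumptions, not part of any specific system.

```python
# Minimal sketch of a conventional "designed stimulus" EEG paradigm (SSVEP):
# the decoder only checks which known flicker frequency dominates the spectrum.
# FS, FLICKER_HZ and the synthetic signal are assumptions made for illustration.
import numpy as np
from scipy.signal import welch

FS = 250                          # EEG sampling rate in Hz (assumed)
FLICKER_HZ = [8.0, 10.0, 12.0]    # candidate on-screen flicker frequencies (assumed)

def classify_ssvep(eeg_window: np.ndarray) -> float:
    """Return the flicker frequency whose narrow-band power is largest
    in a single-channel EEG window of shape (n_samples,)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=2 * FS)
    scores = []
    for f in FLICKER_HZ:
        band = (freqs >= f - 0.3) & (freqs <= f + 0.3)  # narrow band around each flicker rate
        scores.append(psd[band].sum())
    return FLICKER_HZ[int(np.argmax(scores))]

if __name__ == "__main__":
    # Synthetic 4-second window: a 10 Hz response buried in noise.
    t = np.arange(0, 4, 1 / FS)
    rng = np.random.default_rng(0)
    window = 2.0 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.5, t.size)
    print(classify_ssvep(window))  # expected output: 10.0
```

The limitation is visible in the code itself: the decoder can only choose among a handful of pre-designed stimuli, and it recovers nothing about what the user is naturally seeing or thinking.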
HAI will develop a hands-free, non-invasive nBCI that creates a natural and intuitive link between the brain and machines. nBCI will be able to understand ‘silent speech’, i.e. what a user is ‘saying’ while it is still only a thought, and to ‘see’ the object that the user is focused on. nBCI has strong prospects as a disruptive technology for everyday life, replacing current human-computer and human-machine interfaces such as keyboards, touchscreens and hand-gesture recognition. nBCI will be foundational for wearable computers and devices, creating a direct link with the wearer’s brain.
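For a concrete sense of what ‘understanding silent speech’ means computationally, the following is a purely illustrative skeleton, not HAI’s design: a multichannel EEG window is reduced to band-power features and mapped onto a small hypothetical command vocabulary by a linear classifier trained on labelled covert-speech trials. The sampling rate, frequency bands, vocabulary and synthetic data are all assumptions made for illustration.

```python
# Purely illustrative skeleton, NOT HAI's method: maps a multichannel EEG window
# onto a small hypothetical vocabulary of covertly spoken commands.
# All parameters and the synthetic training data below are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 250                                # assumed EEG sampling rate (Hz)
VOCAB = ["yes", "no", "stop", "go"]     # hypothetical covert-speech vocabulary

def band_power_features(window: np.ndarray) -> np.ndarray:
    """Mean power per channel in theta/alpha/beta/gamma bands; window is (n_channels, n_samples)."""
    bands = [(4, 8), (8, 13), (13, 30), (30, 45)]
    freqs, psd = welch(window, fs=FS, nperseg=FS, axis=-1)
    return np.concatenate(
        [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in bands]
    )

if __name__ == "__main__":
    # Synthetic stand-in for labelled covert-speech trials: 80 windows, 8 channels, 2 s each.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 8, 2 * FS))
    y = rng.integers(0, len(VOCAB), size=80)
    feats = np.stack([band_power_features(w) for w in X])
    clf = LogisticRegression(max_iter=1000).fit(feats, y)
    new_window = rng.normal(size=(8, 2 * FS))        # a fresh window to decode
    print("decoded word:", VOCAB[clf.predict(band_power_features(new_window)[None])[0]])
```

A practical nBCI would need far richer decoding than a four-word vocabulary, but the skeleton shows the basic shape of the problem: continuous neural signals in, discrete intended words out.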