Scientists at Korea University in Korea and TU Berlin in Germany have developed a brain-computer interface [sciencedaily.com] for a lower-limb exoskeleton that works by decoding specific signals from the user's brain.
Using an electroencephalogram (EEG) cap, the system lets users move forward, turn left and right, sit, and stand simply by staring at one of five flickering light-emitting diodes (LEDs).
...
Each of the five LEDs flickers at a different frequency, and when the user focuses their attention on a specific LED, that frequency is reflected in the EEG readout. The system identifies this signal and uses it to control the exoskeleton.

A key problem has been separating these precise brain signals from those associated with other brain activity, and from the highly artificial signals generated by the exoskeleton.
"Exoskeletons create lots of electrical 'noise'" explains Klaus Muller, an author on the paper. "The EEG signal gets buried under all this noise -- but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal."
Brain-scanning with EEG caps has been making appearances at Maker Faire [theverge.com] for the last couple of years. Has anyone experimented with these kinds of rigs? Are they the right interface for exoskeletons, or is there a better way?