Being able to vocalize is one of the most essential elements of the human experience: infants are expected to start babbling their first words before they're one year old, and much of their later life revolves around interacting with others through vocalizations of varying vocabulary and fluency. This makes the impairment or loss of this ability range from difficult to devastating, as with locked-in syndrome (LIS), amyotrophic lateral sclerosis (ALS), and similar conditions, where speaking has become, or will become, impossible.
As fun as brain-computer interfaces (BCI) are, for the best results they tend to come with the major asterisk of requiring the cutting and lifting of a section of the skull in order to implant a Utah array or similar electrode system. A non-invasive alternative consists of electrodes placed on the skin, albeit at reduced resolution. These electrodes are the subject of a recent experiment by [Shaikh Nayeem Faisal] and colleagues in ACS Applied NanoMaterials, employing graphene-coated electrodes in an attempt to optimize their performance.
Although external electrodes can be acceptable for basic tasks, such as registering a response to a specific (visual) impulse or for EEG recordings, they can be impractical in general use. Much of this is due to the disadvantages of the 'wet' and 'dry' varieties, which, as the names suggest, differ in that the former involves an electrically conductive gel.
This gel ensures solid contact and a contact impedance of no more than 5–30 kΩ at 50 Hz, whereas dry sensors perform rather poorly at >200 kΩ at 50 Hz, with worse signal-to-noise characteristics, even before adding in issues such as using the sensor on a hairy scalp, as tends to be the case for most human subjects.
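To see why those impedance figures matter, consider the voltage divider that the electrode forms with the amplifier's input stage. The sketch below assumes a hypothetical 1 MΩ amplifier input impedance, chosen purely for illustration rather than taken from the paper:

```python
def signal_fraction(z_electrode_ohm, z_amp_ohm=1e6):
    """Fraction of the scalp potential that survives the electrode/amplifier
    voltage divider. z_amp_ohm = 1 MOhm is an assumed, illustrative value."""
    return z_amp_ohm / (z_amp_ohm + z_electrode_ohm)

wet = signal_fraction(30e3)    # worst-case wet electrode: 30 kOhm -> ~97% of signal
dry = signal_fraction(200e3)   # best-case dry electrode: 200 kOhm -> ~83% of signal
```

The dry electrode not only loses a larger slice of an already microvolt-level signal, but its higher source impedance also picks up proportionally more mains interference, which is why the 50 Hz test frequency is the one quoted.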
In this study, the researchers created electrode arrays in a number of configurations, each of which used graphene as the interface material. The goal was to get a signal on par with wet electrodes even through human hair, such as on the back of the head near the visual cortex. They got very promising results with hex-patterned epitaxial graphene (HEPG) sensors, and even at this early prototype stage, the technique could offer an alternative where wet electrodes are not an option.
While the subject is complex, brain-computer interfaces don’t have to be the sole domain of research laboratories. We recently covered an open hardware Raspberry Pi add-on that can let you experiment with detecting and filtering biosignals from the comfort of your own home.
One day in the future, we may interact with our electronic devices not with physical input or even voice commands, but simply by thinking about what we want to do. Such brain-computer interfaces (BCIs), combined with machine learning, could allow us to turn our ideas into reality faster and with less effort than ever before: imagine being able to produce a PCB design simply by thinking about how the completed circuit would work. Of course, as an assistive technology, BCIs would be nothing less than life-changing for many.
Today BCIs are in their infancy, but that doesn't mean there isn't room for hackers and makers to experiment with the concept. [Ildar Rakhmatulin] has been working on low-cost open source BCIs for years, and with the recent release of his PiEEG on Crowd Supply, thinks he's finally found an affordable solution that will let individuals experiment with this cutting-edge technology.
Implemented as a shield that can be connected to a Raspberry Pi 3 or 4, the PiEEG features 8 channels for connecting wet or dry electrodes that can measure biosignals such as those used in electroencephalography (EEG), electromyography (EMG), and electrocardiography (ECG). With the electrodes connected, reading these biosignals is as easy as running a Python script. While primarily designed for neuroscience experimentation, [Ildar] says the device is also useful for learning more about signal processing, filters, and machine learning.
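Cleaning up such a biosignal doesn't take much code, either. As a rough illustration of the kind of filtering involved (this is not the PiEEG's actual API, and the sample values are synthetic), here's a pure-Python sketch of a classic trick: at a 250 samples-per-second rate, a five-tap moving average spans exactly one period of 50 Hz mains hum, nulling it while largely passing the slower EEG bands:

```python
import math

FS = 250  # assumed sample rate in samples/s, typical for hobby EEG boards
N = 5     # moving-average length; nulls frequencies at multiples of FS/N = 50 Hz

def moving_average(samples, n=N):
    """Simple FIR low-pass: each output is the mean of the last n samples."""
    return [sum(samples[i:i + n]) / n for i in range(len(samples) - n + 1)]

# Synthetic "EEG": a 10 Hz alpha rhythm buried in 50 Hz mains interference.
t = [i / FS for i in range(FS)]  # one second of samples
alpha = [math.sin(2 * math.pi * 10 * s) for s in t]
mains = [0.5 * math.sin(2 * math.pi * 50 * s) for s in t]
raw = [a + m for a, m in zip(alpha, mains)]

# The 5-tap average spans exactly one 50 Hz period at 250 S/s, so the mains
# component sums to zero while the 10 Hz alpha wave passes nearly unchanged.
clean = moving_average(raw)
```

A real pipeline would use proper IIR notch and band-pass filters (e.g. from SciPy), but the principle of picking filter parameters around the mains frequency and the sample rate is the same.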
When it comes to something as futuristic-sounding as brain-computer interfaces (BCI), our collective minds tend to zip straight to scenes from countless movies, comics, and other works of science-fiction (including more dystopian scenarios). Our mind’s eye fills with everything from the Borg and neural interfaces of Star Trek, to the neural recording devices with parent-controlled blocking features from Black Mirror, and of course the enslavement of the human race by machines in The Matrix.
And now there’s this Elon Musk guy, proclaiming that he’ll be wiring up people’s brains to computers starting next year, as part of this other company of his: Neuralink. Here the promises and imaginings are truly straight from the realm of sci-fi, ranging from ‘reading and writing’ to the brain, curing brain diseases and merging human minds with artificial intelligence. How much of this is just investor speak? Please join us as we take a look at BCIs, neuroprosthetics and what we can expect of these technologies in the coming years.