If there’s one area of computer interface research that’s harder to comprehend than any other, it has to be mind control – using the power of your brain alone to manipulate and control the data and instructions that a computer receives. The complexities of such a system are formidable, but the potential benefits are immense and potentially life-changing. Only a few years ago the very idea of using brainwaves to operate machinery would have been the preserve of science fiction novels or the babblings of madmen, but significant work is being done in this field at the moment, and the results are quite staggering. One company has even decided to showcase its progress in a rather unusual fashion – by strapping it to a skateboard.
Chaotic Moon is a Texas-based company that has already caused quite a stir in the three years since it was formed in 2010. It was responsible for a Kinect-powered shopping cart prototype which enabled shoppers to roam the aisles of a supermarket while the cart propelled itself along behind them, scanning items as they were placed into the basket. This was followed up with a Kinect-controlled skateboard which the company dubbed the Board of Awesomeness. The next project, though, would showcase a simple but effective use of mind control principles: the designers took a headset built by Emotiv – a company that specialises in brain/machine interfaces – attached it to a Windows 8 tablet, and bolted them both to the aforementioned skateboard, creating the Board of Imagination.
‘We started with very simple things like getting it to move,’ explains Whurley, Chaotic Moon’s General Manager, ‘then trying to get it to move and stop. It was really interesting, because all we did was replace the Kinect with the USB key that talks wirelessly to the Emotiv headset. It was pretty simple as far as physical configuration goes, then fairly complex, cumbersome and trial-and-error with the software.’
The basic principles of the technology are relatively simple. When we think about something in particular, our brain creates patterns of electrical activity. These patterns can be recorded using headsets such as the ones Emotiv manufactures, and then translated into recognizable commands for a computer system to execute. As with speech recognition, though, there are still issues of compatibility between one person and the next.
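That translation step can be pictured with a toy sketch. Nothing below reflects Emotiv’s actual SDK or signal processing – a real system uses far richer features and trained classifiers – but it shows the shape of the idea: reduce a window of readings to a feature, then map that feature to a command.

```python
# Toy sketch of pattern-to-command translation. Every name, number and
# threshold here is invented for illustration only.

def band_power(samples):
    """Crude proxy for EEG band power: mean squared amplitude."""
    return sum(s * s for s in samples) / len(samples)

def translate(samples, thresholds):
    """Map a window of readings to the strongest matching command."""
    power = band_power(samples)
    # Keep every command whose threshold the signal clears, then take
    # the one with the highest threshold (the strongest match).
    matched = [(t, cmd) for cmd, t in thresholds.items() if power >= t]
    return max(matched)[1] if matched else "idle"

thresholds = {"forward": 1.0, "faster": 4.0}    # calibrated per user
print(translate([0.5, -0.4, 0.3], thresholds))  # weak signal -> idle
print(translate([1.5, -1.2, 1.8], thresholds))  # stronger -> forward
```

In a real headset the feature extraction and the mapping are both learned per wearer, which is exactly the compatibility problem the article goes on to describe.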
‘The reason is your brain has folds in it,’ says Whurley, ‘and yours are different from mine, so the electrical patterns are different from mine. It’s not a magic technology where I can put it on anybody and get the exact same results all the time. What we did was literally, over hundreds of times, test different people doing different stuff, and we came up with a way we could get it to work for ninety-five percent of people. With the simple commands, not the complex commands. Things like moving forward, forward faster, slowing down, and stopping. Those were the four basic ones that we tried to get to work, and homogenise if you will, so that across everybody’s brainwaves it would work. I will tell you…it is unreal how many people go absolutely bananas. They love it. People are just blown away. It’s this moment of magic and sorcery which is kind of awesome.’
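The ‘homogenising’ Whurley describes – finding one setting that works across many different brains – can be imagined as choosing a shared decision boundary from lots of individual calibration sessions. A toy illustration, with every number invented:

```python
# Each (invented) user recorded a typical "resting" signal power and a
# typical "go forward" power during calibration. We pick one shared
# threshold that separates the two states for as many users as possible.

users = [
    {"rest": 0.8, "go": 2.1},
    {"rest": 1.1, "go": 2.6},
    {"rest": 0.9, "go": 1.4},
    {"rest": 0.7, "go": 2.9},
]

def coverage(threshold):
    """Fraction of users whose rest/go states the threshold separates."""
    ok = sum(1 for u in users if u["rest"] < threshold <= u["go"])
    return ok / len(users)

# Search candidate thresholds drawn from the observed readings.
candidates = sorted({u[k] for u in users for k in u})
best = max(candidates, key=coverage)
print(best, coverage(best))  # -> 1.4 1.0
```

With real brainwave data the two states overlap for some wearers, which is why Chaotic Moon reportedly settled for a setting that covered about ninety-five percent of people rather than everyone.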
Controlling something with the mind is still such a new way for humans to interact with machines that it can be hard to switch off our thoughts, something which can have unexpected results. An example of this can be seen when CNET reporter Molly Wood, while testing the Board of Imagination, nearly crashed it into a wall even though she was no longer riding it.
‘Yes, yes she did!’ Whurley exclaims. ‘The thing you’ll notice on that video is that the skateboard kept going, and the reason it did is that Molly was thinking about moving because she was chasing after it. What she didn’t understand is that by doing that she was actually driving it further and further, faster away from us. That’s why on the video you see me say “stop thinking!”’
Urban sports aside, the potential of technologies that require only an uplink to a headset yet offer the possibility of complex operation is not something that Chaotic Moon dismisses.
‘There’s implications for people in wheelchairs,’ Whurley considers, ‘there’s implications for people with disabilities of all kinds. In addition to that there’s repetitive tasks, controlling automation. So, for example, controlling brain/computer interfaces as part of a robotics control system in manufacturing or hazardous areas, and things like that. So there’s a lot of different areas you could take this, and that’s what we try to do.’
The end goal of a mind control system would arguably be one where human and machine form some kind of symbiosis. It’s a long way from opening your email just by thinking about it to a computer-controlled exoskeleton that would empower a paraplegic person to walk again. But this very idea is one that Dr Miguel Nicolelis is trying to make a reality, and he has a notable deadline. At the opening ceremony of the 2014 World Cup in Brazil, a young paraplegic adult will, if things go to plan, take several steps and kick a football thanks to a robotic suit which he or she will wear and control via a thought-controlled interface. It promises to be a wholly remarkable sight, the significance of which will overshadow any of the football that follows.
‘The idea of having a demonstration at the opening ceremony of the World Cup,’ Dr Nicolelis stated in a recent interview, ‘was basically generated by our desire to speed up the process of bringing this technology to clinical applications. I think showcasing the potential of those few steps in a prototype way is literally the kick-off of this field’.
In his book Beyond Boundaries, Dr Nicolelis charts the development of this area of neuroscience and how the future could look very different if the theory becomes a reality. The current research he and his colleagues at the ‘Walk-Again Project’ are conducting involves, in very simplistic terms, implanting micro-electrode arrays into the brain itself to measure precise brain activity, accompanied by implanted microchips or ‘neurochips’. The signals are then processed and sent wirelessly to a BMI (Brain Machine Interface), which in turn translates the thoughts into commands that power the robotic neuroprosthesis. It sounds incredible, but the scientists remain confident that the technology will be ready for their big day and a potential audience of hundreds of millions of people. In the short term (which in scientific terms is the next decade or so) the technology would be focused on helping those with paralysis, Parkinson’s disease, and other neurological disorders. But as the technology becomes accepted and costs begin to decline, Dr Nicolelis sees more mainstream applications becoming viable.
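In outline, the signal chain described above runs: implanted electrodes, then a decoder, then a motor command. The sketch below is purely schematic and bears no relation to the Walk-Again Project’s actual software; every stage name and value is an invented stand-in.

```python
# Schematic of a brain-machine interface loop: record neural activity,
# decode an intended action, drive the prosthesis. Illustrative only.

def record(electrode_array):
    """Stand-in for the implanted micro-electrode read-out."""
    return sum(electrode_array) / len(electrode_array)

def decode(activity, threshold=0.5):
    """Translate raw activity into an intended action."""
    return "step" if activity > threshold else "hold"

def actuate(intent):
    """Turn the decoded intent into a command for the exoskeleton."""
    return {"step": "drive leg motors", "hold": "maintain posture"}[intent]

# One tick of the wireless loop: brain -> BMI -> neuroprosthesis.
samples = [0.2, 0.9, 0.8, 0.7]           # invented electrode readings
print(actuate(decode(record(samples))))  # -> drive leg motors
```

The real system runs this loop continuously and decodes far more than a binary step/hold choice, but the division of labour – sense, decode, actuate – is the same.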
‘When we improve our ways to read brain activity with non-invasive technology,’ he concludes, ‘so technology does not require, like we do today, these small implants on the brain to read electrical signals from populations of brain cells. When we get to that level we truly will be able to liberate the brain from the physical limits of our bodies. We will be able to communicate in different ways, we will be able to control devices just by thinking. The times in which we will have to exert force or exert our own movements into the world to control devices probably will be gone.’
In a relatively short space of time our relationship with computers has gone from huge water-cooled mainframes that required specialist operators to far more powerful devices we carry in our pockets. The way we use our devices, and our expectations of them, is now beginning to alter their design, with newer and more powerful interfaces evolving to delight and surprise us. But we have only scratched the surface. For some of us it might seem almost impossible to consider a computer that has no keyboard or physical means of control. A few years from now it might be impossible to imagine that we ever needed them at all.