Creating a working brain-computer interface will be messy.

Crossing the finish line will be a messy process, but a working brain-computer interface would enable all sorts of interaction not possible today.

There are people who are dedicated to their vision, and then there are people like Phil Kennedy. Wired has a fascinating feature about the man. In a nutshell, Mr. Kennedy is a neurologist who has been working on brain-computer interfaces since the 1990s, and who put himself under the knife in order to test some of his equipment.

To put it in blunt terms, he paid someone $30,000 to slice open his scalp, shove wires into his brain, and implant electrodes. He then went home and recorded his own brain activity while speaking, in an attempt to refine his idea of an interface that would let someone communicate speech with their thoughts (to be used, for example, by people with injuries or other conditions that prevent them from communicating verbally).

The idea, generally speaking, is nothing new. We’ve known since the 1800s that there is electrical activity in the brain, and where there is electrical activity, there should in theory be a way to read or manipulate it. When computers came along in the 1950s and 1960s, efforts to use the brain as a control interface followed almost immediately.
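
As a concrete (and heavily simplified) illustration of what “reading” that activity means computationally, here is a toy Python sketch. Everything in it is assumed for illustration: the signal is synthetic, and the sampling rate, frequency band, and threshold are arbitrary stand-ins, not any real lab’s pipeline.

```python
# Toy sketch: treat a voltage time series as a control signal by
# isolating one frequency band and thresholding its power.
# The signal is synthetic; fs, the band, and the threshold are
# arbitrary choices for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                       # assumed sampling rate, in Hz
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)

# Fake "EEG": background noise plus a 10 Hz (alpha-band) rhythm
signal = 0.5 * rng.standard_normal(t.size) + np.sin(2 * np.pi * 10 * t)

# Band-pass filter to the 8-12 Hz alpha band
b, a = butter(4, [8, 12], btype="band", fs=fs)
alpha = filtfilt(b, a, signal)

# Crude "interface": enough band power counts as a command
band_power = np.mean(alpha ** 2)
print("toggle" if band_power > 0.3 else "idle")
```

Real systems layer far more on top (artifact rejection, machine-learned decoders), but the basic loop (acquire, filter, decode, act) is the same shape.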

Let’s think about a use case. As a programmer, it’s easy to see why such a device would hold appeal. How do we code today? We sit somewhere and type on a keyboard. This is a fairly easy process, and one we can do almost as quickly as we speak. But I know from experience that typing moves nowhere near as fast as my brain, and it is also fraught with problems.

Our bodies do this stuff fast enough that we don’t think about it much in everyday life, and we’re also just used to our limitations. For us, “it is what it is”, but think about what’s actually happening here: first, we programmers conceive, in our brains, the abstract structure we think will work in our code. We have to translate those ideas into whatever language we’re using (not unlike translating your thoughts into French vs. Spanish, or whatever). We then need to fire electrical impulses from the brain down our arms and into our fingers for each component of each programmatic structure (the smallest unit of which is a single character). At the same time, we have to follow along and try to process the output (did our fingers fully strike the keys?) and stay on the lookout for typos, errors, etc.

Oh, and errors? There are many. It’s anecdotal, but I am convinced that more than 50% of coding is fixing errors (and I suspect the real figure is much higher). Sometimes these errors are conceptual, meaning we can’t blame them on the mechanical process of translating thoughts into finger movements. Programming is inherently a process of trial and error, so errors in judgment or strategy are to be expected. But there is another class of mistakes entirely, insidious ones that result from simple mechanical miscues. Either you inadvertently hit the wrong key (or miss one), or, because your brain is moving more quickly than your fingers, you rush to get the idea out before it flutters away and are left with issues in the code that you have to go back over and double-check. Sometimes these errors are so minute that they don’t become apparent until the code is actually run.
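
To make that last point concrete, here is a contrived Python example (the function names are invented for illustration) of a one-character slip that an editor accepts without complaint and that only surfaces when the code runs:

```python
def average(values):
    # Computes the mean of a list of numbers; this part is correct.
    total = 0
    for v in values:
        total += v
    return total / len(values)

def report(samples):
    # One-character typo: 'avarage' instead of 'average'. Python
    # happily accepts this at definition time; the NameError only
    # appears once report() is actually called.
    return f"mean = {avarage(samples)}"

print(report([1, 2, 3]))  # NameError: name 'avarage' is not defined
```

A statically checked language would catch this particular slip before the program runs, but subtler variants (a transposed index, a mistyped constant) sail through in any language.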

So, as a coder, I find it frustrating that the latter class of errors exists at all. If we could eliminate the physical, mechanical round trip of data from brain to extremities, we could reduce or eliminate many of the minor errors in our code. This is one reason a brain-computer interface would be a boon for programmers (not to mention writers and so on). If it could capture your intent perfectly (in ways your fingers can’t), then at the least you would be left only with conceptual errors, which are the more interesting ones to think about, test, and examine, and the ones with more impact on the quality of the end result. We could spend more time optimizing structures for performance instead of hunting down syntactic errors.

There would be other advantages. As it is today, programmers have to park their asses in a chair somewhere to work. This is another aspect of coding that an accurate brain-computer interface would improve. Not only is it a drag to have to put yourself in a particular location to work, but sitting is generally bad for your health, whether you are working or watching TV.

Besides programming, another activity I engage in is running. At the moment I am training for a marathon, which eats up an awful lot of my time. You could argue that time spent getting fit isn’t wasted (although extreme running like marathoning is probably not good for you), and when you are fit you generally have more energy to work or be productive in other ways. Still, the actual hours spent running are not productive in terms of getting things done.

When I’m in that zone, legs relentlessly moving, there’s little to do but think, and it’s often then that I can work out issues with my work. What if I could actually be coding during this time (or even just doing something simpler, like chatting with a friend)? There are other similar moments, like when you’re out at a bar having a few pints with friends and an elegant solution to some nagging structural problem suddenly pops into your head. It would be great to be able to save it without even leaving the bar.

So a well-functioning brain-computer interface would be a boon to all sorts of people. Most of us probably don’t want someone jamming wires into our brains (indeed, most current efforts are centered on external devices). But intrepid souls like Phil Kennedy are needed along the route to progress. Even if their efforts alone don’t crack the nut, they add to the body of knowledge that will let someone else finally make the pieces fit together.
