A brain-computer interface would need to be connected to millions of neurons in order to match the amount of information a computer screen can display, and the technology to do that doesn’t exist yet as far as I know.
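A quick back-of-the-envelope comparison, as a sanity check (all of these numbers are rough assumptions for illustration, not measurements):

```python
# Back-of-envelope comparison: monitor output rate vs. per-neuron input rate.
# All figures are rough assumptions chosen for illustration only.

# A 1080p monitor refreshed at 60 Hz, 24 bits per pixel (raw, uncompressed):
monitor_bits_per_sec = 1920 * 1080 * 60 * 24          # ~3.0e9 bits/s

# Assume a stimulated neuron can receive on the order of 10 bits/s
# (a few bits per spike at modest firing rates -- an assumption).
bits_per_neuron_per_sec = 10

neurons_needed = monitor_bits_per_sec / bits_per_neuron_per_sec
print(f"Raw monitor output: {monitor_bits_per_sec:.2e} bits/s")
print(f"Neurons needed at 10 bits/s each: {neurons_needed:.2e}")
# ~3e8 neurons for the raw signal; even after heavy compression
# (most pixels are redundant), "millions" stays the right order of magnitude.
```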
I took a brief look at the current state of such connections for our Coalescing Minds paper:
3.1. Direct brain-to-brain connections. The easiest approach seems to be to connect human
brains directly in much the same way as the two brain hemispheres are connected. The corpus
callosum, which connects the hemispheres, comprises 200–250 million axons crossing from
one hemisphere to the other. It seems likely that to coalesce minds, the number of connections
should be of a similar order of magnitude, probably at least millions.
The technology exists today for creating hundreds of connections: e.g. Hochberg et al. [2006]
used a 96-microelectrode array which allowed a human to control devices and a robotic
hand by thought alone. Cochlear implants generally stimulate the auditory nerve with 16–22
electrodes, and allow many recipients to understand speech in everyday environments
without needing visual cues [Peterson et al. 2010]. Various visual neuroprostheses are currently
under development. Optic nerve stimulation has allowed subjects to recognize simple patterns
and localize and discriminate objects. Retinal implants provide better results, but rely on
existing residual cells in the retina [Ong & da Cruz 2011]. Some cortical prostheses have also
been recently implanted in subjects [Normann et al. 2009]. We are still likely to be below the
threshold of coalescing minds by several orders of magnitude. Nevertheless, the question is
merely one of scaling up and improving current techniques.
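To make the gap concrete, here is a rough calculation using the figures from the excerpt (the one-million threshold is the paper's "at least millions" taken at its low end):

```python
import math

# Connection counts quoted in the excerpt above, plus the corpus callosum
# figure for scale. The 1e6 threshold is an assumption ("at least millions").
connections = {
    "cochlear implant electrodes": 22,
    "microelectrode array (Hochberg et al. 2006)": 96,
    "coalescence threshold (assumed)": 1_000_000,
    "corpus callosum axons (midpoint)": 225_000_000,
}

threshold = 1_000_000
for name, n in connections.items():
    if n < threshold:
        gap = math.log10(threshold / n)
        print(f"{name}: {n:,} -- {gap:.1f} orders of magnitude short")
    else:
        print(f"{name}: {n:,}")
# Current arrays come out 4-5 orders of magnitude below the threshold,
# which is what "several orders of magnitude" in the excerpt refers to.
```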
Yes, and even if it were possible, hooking up all those wires to the visual cortex wouldn’t have much, if any, advantage over just using a traditional display. A computer monitor is already a cybernetic connection; it just makes a virtual connection using light passing through the retina instead of a bunch of messy wires.
Of course there are other types of connections a BCI could potentially enable that would have real uses, but a better visual display is not one of them.
No it isn’t. It is a form of human-computer interface. And a spade is a spade.
In terms of passing information to the brain, yes, it is. It excites neurons in a specific pattern in such a way as to form certain connections in the brain. It does this through cells in the retina, and the information does pass through a specific set of filters before it reaches the cortex, but I don’t think that is an important distinction. To give an example: one of the things a friend of mine is working on in the lab next door is inserting a channelrhodopsin gene into the visual cortex of monkeys. Channelrhodopsin is the protein in retinal cells that causes them to fire in response to light. By inserting it into other neural tissue, we can cause specific cells to fire by shining light of specific frequencies onto them. It’s cool stuff, and I would put money on it becoming a dominant form of BCI in the medium term, at least for getting information into the brain.
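For a sense of the idea, here is a toy sketch of that kind of optical stimulation. The peak-wavelength figure for ChR2 is real (it is most sensitive to blue light, around 470 nm), but the tuning width and threshold are made-up illustrative parameters, not a biophysical model:

```python
import math

# Toy model of channelrhodopsin-style optical stimulation: a cell fires
# when illuminated near its sensitive wavelength above some intensity.
# Tuning width and threshold are illustrative, not physiological values.

def fires(wavelength_nm: float, intensity: float,
          peak_nm: float = 470.0,      # ChR2 responds best to blue light
          bandwidth_nm: float = 50.0,  # made-up tuning width
          threshold: float = 0.5) -> bool:
    """Return True if light at this wavelength/intensity drives the cell."""
    # Gaussian sensitivity curve centred on the peak wavelength.
    sensitivity = math.exp(-((wavelength_nm - peak_nm) / bandwidth_nm) ** 2)
    return sensitivity * intensity > threshold

print(fires(470, 1.0))   # strong blue light -> True
print(fires(470, 0.3))   # weak blue light   -> False: below threshold
print(fires(620, 1.0))   # strong red light  -> False: wrong wavelength
```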
The reason I bring this up is that it uses exactly the same mechanism the retina does; it just bypasses a few of the neural filtering stages. Filters are incredibly useful, and while in the future we may want some of our connections to go directly into the cortex, we might also want to take advantage of some of those systems. To call one a cybernetic interface and not the other seems arbitrary.
Yes, this does mean that every human-computer interface is, in the end, a brain-computer interface with just an extra informational filter in between. It also means that every human interaction is brain-to-brain, again with just extra filters in place. I’m OK with that. I also find the idea very aesthetically pleasing, but that has no weight on whether it is true. When we talk about communication, cybernetics, and interfaces, it may be useful to distinguish between what filters are in place and how those affect the signal, but everything is eventually (interface/brain)-to-brain.
No, it isn’t.
We all know how computer monitors work. We know, roughly speaking, that the information from the computer ends up being processed in the visual cortex. But we can still tell the difference between a computer monitor and a Matrix headjack.
And DON’T EVEN GET ME STARTED on people who think Wikipedia is an “Artificial Intelligence”, the invention of LSD was a “Singularity” or that corporations are “superintelligent”!
Could you give a definition of cybernetics that does not include both? Cybernetics, as a word, has two different meanings. The first is the study of the structure of regulatory systems. This, as applied to electronics, is where I believe it got its second meaning, which is much fuzzier. Most of us picture a Neuromancer-style biomechanical ninja when we hear it, but have nothing in the way of a set definition. In fact, it appears normative, referring to whatever is futuristic. This, of course, changes: well-designed mechanical legs that let you run faster than an Olympic sprinter would easily have been called cybernetics in the 60s. Now that they exist, my impression is that people are more hesitant to use the word.
Do we draw the cybernetic/non-cybernetic line at something that physically touches neural tissue? Or projects light onto it? Or induces changes in it with magnetic stimulation? Does it have to interface with neurons, or do glia count too? Muscle cells? Rods and cones? If we have a device that controls hormones in the blood, is that cybernetic? I understand your point about not overgeneralizing, and I tried to address that in my response. Cybernetics, if it is to mean anything and not become an ambiguous rube/blegg as we discover more, has to be thought of as heavily related to information processing in the brain. Filters are incredibly important; in an information processing system, they are almost everything. But in terms of getting information into the brain, the difference between a cortical brainjack and a monitor is which filters stand in the way. Those filters can be broken down into incredibly complex systems that we can and should distinguish, but that is the proper conceptual framework for the problem.
Maybe the cheapest way to achieve high-bandwidth brain-to-brain communication is to create an artificial body part that can output neural data in a visual format with very high bandwidth (like a cross between a chameleon’s skin and a computer monitor).
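As a sketch of what that might look like (the dimensions, refresh rate, and encoding are all arbitrary illustrative choices, not a proposal for the actual scheme):

```python
import numpy as np

# Sketch: pack a vector of neural activations into a grayscale "skin patch"
# that another person's visual system (or a camera) could read off.

def encode_patch(activations: np.ndarray, side: int = 100) -> np.ndarray:
    """Map activations in [0, 1] onto a side x side 8-bit pixel grid."""
    cells = side * side
    padded = np.zeros(cells)
    padded[:min(len(activations), cells)] = activations[:cells]
    return (padded.reshape(side, side) * 255).astype(np.uint8)

patch = encode_patch(np.random.rand(10_000))
refresh_hz = 30
print(f"Raw channel capacity: {patch.size * 8 * refresh_hz:,} bits/s")
# 100x100 pixels x 8 bits x 30 Hz = 2.4 Mbit/s of raw visual channel --
# vastly more parallel output channels than a 96-electrode array.
```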
I was trying to answer your question about “bandwidth”. The problem is the total amount of information that can be sent to or from the brain per unit time. The usefulness of BCI will be rather limited until it comes within the ballpark of what present computer interfaces (or even interfaces of the 1980s) can do in terms of information throughput.