Humans have bred dogs from wolves. Some dogs have language comprehension and problem-solving skills comparable to those of human children, and they have a friendly attitude toward humans. Dogs are our first AI. An uplifted animal is another way AI could happen.
Direct brain-to-brain communication might produce a meta-consciousness not found in the original brains.
I do remember reading that domesticated dogs have less intelligence than wolves, according to whatever tests are used to measure animal intelligence. However, dogs are better at understanding cues given by humans, though not necessarily at the level of human children.
How? We already have brain to brain communication, and it’s called language.
And the meta-consciousness is called culture.
This could work with some kind of human-machine connection as well. I remember reading a paper in computational neuroscience where they hooked an eel’s brain to a simple machine and created a closed loop: machine output → eel input, eel output → machine input. So the eel received perceptual information from the robot and then sent actions back to make the robot move.
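Here is a rough toy sketch of that loop in Python; every function and number is made up, since I don’t remember the paper’s actual interface:

```python
# Minimal sketch of the closed sensorimotor loop described above.
# All names and values are hypothetical stand-ins, not the real setup.
import random

def robot_sense() -> float:
    """Pretend sensor reading from the robot (machine output, eel input)."""
    return random.uniform(-1.0, 1.0)

def eel_brain(stimulus: float) -> float:
    """Stand-in for the eel's neural response to the stimulus (eel output)."""
    return 0.5 * stimulus  # toy transfer function, purely illustrative

def robot_act(command: float) -> None:
    """Apply the brain's output as a motor command (machine input)."""
    print(f"turn by {command:+.2f}")

# One pass around the loop per tick: sense, stimulate, record, act.
for _ in range(3):
    robot_act(eel_brain(robot_sense()))
```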
What does “direct” mean? Synapses are linked through a wire?
I have no scientific ideas of how it could be done, only science-fiction ones. What I am thinking of is two or more people communicating without speech, writing, gesture, eye contact, or other conventional means. Instead, a thought in one person’s body is shared or perceived in another person’s body: I think of a red fire truck, and by some human-created, non-conventional means, either you know I’m thinking of a red fire truck or you also think of a red fire truck. I can only guess it would be partly direct wiring between brains, partly sensors that detect and transmit or reproduce chemical and electrical changes in the brain. I know a small amount of brain monitoring and brain wiring is possible now, but I make no claim that a full brain-to-brain dialogue can ever happen. I’d like it to, and maybe it will, but I do not claim to know.
If there is a machine that determines that one person is thinking of a red fire truck and then stimulates the neurons in the brain of another person, that’s not direct. The machine is in the middle.
The machine needs an internal language in which it can model “red fire truck”, be able to recognize that concept in Alice by looking at her neuron firing patterns, and then have a model of what pattern of neuron firing would likely cause something like a “red fire truck” to be perceived by the other person.
Given the translation issues of those two changes of representation system, I don’t see why I would call the process “direct”.
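To make the two representation changes concrete, here is a toy sketch; all the firing patterns and electrode numbers are invented, and no real system can decode a concept like “red fire truck” this way. The point is only that the relay passes through the machine’s own internal labels twice:

```python
# Toy sketch of the two representation changes: decode Alice's activity into
# the machine's internal label, then encode that label as stimulation for Bob.
# Every pattern and electrode index here is invented for illustration.
from typing import Dict, List, Tuple

alice_decoder: Dict[Tuple[int, ...], str] = {
    (1, 0, 1, 1): "red fire truck",   # firing pattern -> machine's label
    (0, 1, 0, 1): "blue whale",
}

bob_encoder: Dict[str, List[int]] = {
    "red fire truck": [5, 12, 40],    # machine's label -> electrodes to stimulate
    "blue whale": [3, 7, 22],
}

def relay(alice_firing: Tuple[int, ...]) -> List[int]:
    concept = alice_decoder[alice_firing]  # representation change #1
    return bob_encoder[concept]            # representation change #2

print(relay((1, 0, 1, 1)))  # [5, 12, 40]
```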
To be fair, you could also imagine a setup without such internal interpretation, with signal X recorded at electrode 1 simply causing signal X to be reproduced at electrode 2. It would then be up to the brains on either end to learn how to modulate and interpret this new channel. People can and do learn to use such new channels whenever they are provided.
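A toy sketch of that uninterpreted version, again with made-up signals; the point is that the machine carries no model of what it forwards:

```python
# Pass-through channel: whatever is recorded at electrode 1 is replayed
# verbatim at electrode 2; any meaning is left for the two brains to learn.
from typing import Iterable, List

def passthrough(recorded: Iterable[float]) -> List[float]:
    """Forward each sample unchanged; the machine has no model of content."""
    return [sample for sample in recorded]  # signal X in, signal X out

alice_signal = [0.1, 0.4, -0.2, 0.7]
bob_stimulation = passthrough(alice_signal)
print(bob_stimulation)  # identical to alice_signal
```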
I think that’s still substantially about learning a new language in which to communicate and not just transmitting existing thoughts as is.
I was trying, though perhaps failing, to convey some realization of ‘two heads are better than one.’ Not an AI in the interfacing machine, but a consciousness that is neither of the two people connected; a self-awareness not found in either of them. It’s not Alice and it’s not Bob, but it is partly in Alice, partly in Bob, and perhaps partly in their connection, the way two sounds of just the right frequencies can produce a third sound when they overlap.
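The sound analogy, for what it’s worth, is the ordinary beat phenomenon: two tones at slightly different frequencies sum to a signal whose loudness rises and falls at the difference frequency, a rhythm present in neither tone alone. A quick numeric check with arbitrarily chosen frequencies:

```python
# Two tones 3 Hz apart: the summed signal's envelope rises and falls 3 times
# per second, following 2*cos(pi*(f1 - f2)*t), a "beat" in neither tone alone.
import math

f1, f2 = 440.0, 443.0
for t in (0.0, 1/12, 1/6, 1/4, 1/3):   # one full beat cycle lasts 1/3 s
    envelope = abs(2 * math.cos(math.pi * (f1 - f2) * t))
    print(f"t = {t:.3f} s   envelope = {envelope:.2f}")
```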