Is neuroscience research underfunded? If so, I’ve been thinking more and more that trying to understand human consciousness has a huge expected value, and maybe EA should pay it more attention.
“understand human consciousness has a huge expected value”
Does it? What do you think the expected return of discovering the precise mechanics of consciousness would be? Or what if neuroscience dissolves consciousness?
Well, suppose it increases awareness of the threat of AGI: if we can prove that consciousness is not some mystical, supernatural phenomenon, it would become clearer that intelligence is just a matter of information processing.
Furthermore, the ethical debate about creating artificial consciousness in a computer (mindcrime issues, etc.) would very shortly become a mainstream issue, I would imagine.
I’m not sure that intelligence and consciousness are one and the same thing, or, in your words, that consciousness/intelligence is information processing. If you conclude that intelligence is information processing, then it might be an aspect of the body, an attribute, in roughly the same way consciousness is. That aspect of the body would then be evolving in machines, which we call artificial intelligence, independently of conscious experience.
Consciousness has such a wide variety of states, whether mystical or religious experiences, persistent non-symbolic experiences, nonduality, or even ordinary states and so forth. It’s fine that these states are described from the perspective of neurons firing in the brain, but from the perspective of the beholder it’s, well, you know… maybe unsatisfactory to conclude that the source is the brain? William A. Richards[1], for example, has the view that the ‘hard problem of consciousness’ is a philosophical question, and I don’t doubt that many others who have experienced these states have a more open appreciation for this idea.[4]
But as a philosophical question, even granting the assertion that consciousness is information processing, the ‘brain as a receiver or reducing valve’ idea could still hold. On that view, creating conscious machines would mean instantiating a new reducing valve for, or receiver of, Mind-At-Large, however you want to look at it.
Recent neuroimaging studies have rekindled interest in Aldous Huxley’s philosophical idea[2] that the brain is a reducing valve for Mind-At-Large (consciousness), by showing that reduced blood flow to certain brain regions, for example under psychedelics, is associated with a more intense subjective experience.
“As you can see here, there was a negative correlation between the blood flow to these areas and the intensity of the subjective experience by the subjects, so the lower the blood flow, the more intense the subjective experience.” [3]
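To make the “negative correlation” claim concrete, here is a minimal sketch of what such a relationship looks like as a Pearson correlation. The numbers are made up purely for illustration and are not data from the cited study:

```python
import numpy as np

# Illustrative sketch only: hypothetical numbers, not data from the cited study.
# "Negative correlation" here means: as blood flow to a region drops,
# the reported intensity of the subjective experience goes up.
blood_flow = np.array([0.95, 0.90, 0.82, 0.75, 0.70, 0.60])  # hypothetical normalized regional blood flow
intensity = np.array([2.0, 2.5, 4.0, 5.5, 6.0, 7.5])         # hypothetical subjective-intensity ratings

# Pearson correlation coefficient between the two variables
r = np.corrcoef(blood_flow, intensity)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # close to -1, i.e. strongly negative
```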
Probably the most efficient way to accelerate neuroscience research is with AGI, and I wouldn’t be surprised if DeepMind’s eventual AGI were used for this purpose; Hassabis, for example, is a neuroscientist and has been a strong proponent of AGI scientists.
[1] https://www.theguardian.com/books/2015/dec/07/william-a-richards-psychedlics-entheogens-book
[2] https://www.goodreads.com/quotes/327452-aldous-huxley-compared-the-brain-to-a-reducing-valve-in
[3] https://blogs.scientificamerican.com/scicurious-brain/this-is-your-brain-on-psilocybin/
[4] https://www.psychologytoday.com/blog/unique-everybody-else/201604/psychedelic-drugs-and-the-nature-personality-change