“we can still emulate it if it is mechanical.”
right, but how many more orders of magnitude of hardware do we need in this case? this depends on what level of abstraction is sufficient. isn’t it the case that if intelligence relies on the base level and has no useful higher-level abstractions, the amount of computation needed would be absurd (assuming the base level is computable at all)?
Probably a few less. This OB post explains how a good deal of the brain’s complexity might be mechanical work to increase signal robustness. Cooled supercomputers with failure rates of 1 in 10^20 (or whatever the actual rate is) won’t need to simulate the parts of the brain that error-correct or maintain operation during sneezes or bumps on the head.
good reference, but I mean how much more we’d need if we were forced to simulate at, say, the molecular level rather than simply as a set of signal processors.
Even emulating a single neuron at the molecular level is far beyond us.
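For a rough sense of the gap being argued about, here is a back-of-envelope sketch comparing the two abstraction levels. Every number in it is an illustrative assumption (round figures in the spirit of common whole-brain-emulation estimates), not a measurement:

```python
import math

# Back-of-envelope comparison of emulation cost at two abstraction levels.
# ALL numbers below are rough illustrative assumptions, not measurements.

NEURONS = 1e11            # assumed neuron count in a human brain
SYNAPSES_PER_NEURON = 1e4 # assumed average synapse count per neuron
UPDATE_RATE_HZ = 100      # assumed average update/firing rate

# Level 1: the brain as a network of simple signal processors.
# Assume ~10 floating-point ops per synapse per update.
signal_level_flops = NEURONS * SYNAPSES_PER_NEURON * UPDATE_RATE_HZ * 10

# Level 2: molecular-scale simulation.
# Assume ~1e14 molecules per neuron, ~1e3 ops per molecule per timestep,
# and femtosecond-scale timesteps (~1e15 per simulated second).
MOLECULES_PER_NEURON = 1e14
molecular_level_flops = NEURONS * MOLECULES_PER_NEURON * 1e3 * 1e15

gap = math.log10(molecular_level_flops / signal_level_flops)
print(f"signal-processor level: ~{signal_level_flops:.0e} FLOPS")
print(f"molecular level:        ~{molecular_level_flops:.0e} FLOPS")
print(f"gap: ~{gap:.0f} orders of magnitude")
```

Under these (very hedged) assumptions the answer to “how many more orders of magnitude” is on the order of twenty-five, which is why the molecular-level worst case matters so much to the argument.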
Well, I don’t think we will ever be forced to simulate the brain at a molecular level. That possibility is beyond worst-case; as Chalmers says, it’s discordant with the evidence. The brain may not be an algorithmic rule-following signal processor (1), but an individual neuron is a fairly simple analog input/output device.
1: Though I think the evidence from neuroscience quite strongly suggests it is, and if all you’ve got against it is the feeling of being conscious then you honestly haven’t got a leg to stand on.
I’m playing devil’s advocate in that I don’t think the brain will turn out to be anything more than a complex signal processor.
neurons do seem fairly simple, but we don’t know what’s waiting for us when we try to algorithmically model the rest of the brain’s structure.
Very true. It’s not going to be anywhere near as hard as the naysayers claim, but it’s definitely much harder than we’re capable of now.