From the comments, PZ elaborates:
“Andrew G: No, you don’t understand. Part of this magical “scan” has to include vast amounts of data on the physics of the entity…pieces which will interact in complex ways with each other and the environment. Unless you’re also planning to build a vastly sped up model of the whole universe, you’re going to have a simulation of brain running very fast in a sensory deprivation tank.
Or do you really think you can understand how the brain works in complete isolation from physiology, endocrinology, and sensation?”
Seems like PZ is dismissing the feasibility of emulation by assuming that the computation has to be perfectly literal. To make a chemistry analogy: one does not have to model the quantum mechanics and the dynamics of every single molecule in a beaker of water in order to simulate the kinetics of a reaction in that water. Likewise, one does not need to replicate the chemical entirety of a neuron in silico; one merely needs to replicate the neuron’s stimulus-response patterns.
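To make the abstraction concrete: a leaky integrate-and-fire model reproduces a neuron’s basic stimulus-response pattern in a few lines, with no molecular or quantum-mechanical detail anywhere. A minimal sketch in Python (the parameter values are illustrative choices of mine, not taken from any particular study):

```python
import numpy as np

def lif_spikes(input_current, dt=1e-4, tau=0.02, v_rest=-0.065,
               v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Leaky integrate-and-fire neuron; returns spike times in seconds.

    The membrane is a single leaky capacitor: everything below the
    level of 'voltage crosses threshold, neuron spikes' is abstracted
    away, yet the stimulus-response pattern survives.
    """
    v = v_rest
    spikes = []
    for i, i_in in enumerate(input_current):
        # Euler step of dV/dt = (-(V - V_rest) + R*I) / tau
        v += dt * (-(v - v_rest) + r_m * i_in) / tau
        if v >= v_thresh:          # threshold crossing = spike
            spikes.append(i * dt)
            v = v_reset            # reset; ion channels not modeled
    return spikes

# A constant 2 nA input yields a regular spike train:
print(lif_spikes(np.full(10_000, 2e-9)))
```

Richer behavior calls for richer models (Hodgkin-Huxley, multi-compartment, and so on), but the principle is the same: you pick the level of description that reproduces the behavior you care about, not the full physics.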
Oops, I didn’t see a further comment below. In response to a comment (“I still don’t understand why biologists insist that you have to do a perfect simulation, down to the smallest molecule, and then state the obvious fact that it’s not going to happen.”), PZ says this:
“Errm, because that’s what the singularitarians we’re critiquing are proposing? This whole slice-and-scan proposal is all about recreating the physical components of the brain in a virtual space, without bothering to understand how those components work. We’re telling you that approach requires an awfully fine-grained simulation.
An alternative would be to, for instance, break down the brain into components, figure out what the inputs and outputs to, say, the nucleus accumbens are, and then model how that tissue processes it all (that approach is being taken with models of portions of the hippocampus). That approach doesn’t require a detailed knowledge of what every molecule in the tissue is doing.
But the method described here is a brute force dismantling and reconstruction of every cell in the brain. That requires details of every molecule.”
Still seems like a straw man.
Erm, please clarify how.
Well, there are many different possible levels of brain emulation (just like in emulating video game consoles), all of which have different demands and feasibilities. The Whole Brain Emulation roadmap discusses several.
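The console analogy is worth making concrete: a “high-level” emulator replaces a component with anything that reproduces its observable input-output behavior, while a “low-level” emulator simulates the component’s internals. A toy sketch (the 8-bit adder example is mine, purely for illustration):

```python
def add8_behavioral(a, b):
    """High-level emulation: reproduce the chip's observable
    input-output behavior directly, ignoring the hardware."""
    return (a + b) & 0xFF

def add8_gate_level(a, b):
    """Low-level emulation: simulate the ripple-carry adder bit by
    bit, the way the silicon actually computes the sum."""
    result = carry = 0
    for i in range(8):
        x = (a >> i) & 1
        y = (b >> i) & 1
        result |= (x ^ y ^ carry) << i       # full-adder sum bit
        carry = (x & y) | (carry & (x ^ y))  # full-adder carry-out
    return result

# Identical observable behavior, very different simulation cost:
assert all(add8_behavioral(a, b) == add8_gate_level(a, b)
           for a in range(256) for b in range(256))
```

Both functions are indistinguishable from the outside, but the gate-level one does far more work per call; the same fidelity-versus-cost trade-off runs through the roadmap’s levels of brain emulation.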
No one denies that an emulation requiring the details of every molecule would be very brute force and difficult, and as far as that goes, he’s not strawmanning. But to treat this as the only kind of emulation, and to dismiss emulation in general on the basis of that specific case, is a straw man.
In the first quote, he sets up the straw man as gwern describes it. In the second quote, he defends his first straw man by saying “but that’s what singularitarians believe”, essentially putting up a second straw man to defend the first.
The quote jumps from models of large brain regions straight to molecule-by-molecule analysis, skipping the intermediate level of building models of individual neurons. Hence all the talk in the roadmap about predictive models.
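A predictive model at the neuron level is essentially system identification: record a neuron’s responses to known stimuli, then fit a simpler model until it predicts them. A hedged sketch, reusing the integrate-and-fire abstraction from above as both the “recorded” neuron and the candidate model (the stimulus amplitudes and parameter grid are arbitrary choices of mine):

```python
import numpy as np

def spike_count(input_current, tau, dt=1e-4, v_rest=-0.065,
                v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Spike count of the leaky integrate-and-fire neuron above."""
    v, n = v_rest, 0
    for i_in in input_current:
        v += dt * (-(v - v_rest) + r_m * i_in) / tau
        if v >= v_thresh:
            n, v = n + 1, v_reset
    return n

rng = np.random.default_rng(0)
stimuli = [np.full(5_000, amp) for amp in rng.uniform(1e-9, 3e-9, 10)]

# Pretend these counts were recorded from the real neuron:
recorded = [spike_count(s, tau=0.02) for s in stimuli]

# Fit tau by grid search so the model predicts the recordings:
best_tau = min(np.linspace(0.005, 0.05, 46),
               key=lambda t: sum((spike_count(s, t) - r) ** 2
                                 for s, r in zip(stimuli, recorded)))
print(f"recovered tau = {best_tau:.3f} s")  # should land at or near 0.020
```

A real fit would estimate several parameters against actual recordings rather than recover one known constant, but the shape of the procedure, predict and compare rather than copy molecule by molecule, is the same.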