Curated. I like “think clearly about confusing philosophical topics” posts, and this one is a very well-written explainer. I think it’s likely both correct and ought to be convincing. At the same time, I think it’s somewhat incomplete, and that a fuller version would make the case for why we should wholly believe that experience is determined by local brain state rather than treat it as an assumption. Beyond that, I think that the most convincing explainer on this topic would build on a fully satisfactory theory of consciousness that totally answers the hard problem and makes none of it feel mysterious at all. Still, glad to see this on LessWrong, taking us kind of back to our roots.
I find this rather difficult to believe in light of andesoldes’s excellent distillation of Rob’s position and subsequent detailed and concrete explanation of why it seems wrong to have this degree of confidence in his beliefs.

As TAG has written a number of times, the computationalist thesis seems not to have been convincingly (or even concretely) argued for in any LessWrong post or sequence (including Eliezer’s Sequences). What has been argued for, over and over again, is physicalism, along with ever more rejections of dualist conceptions of souls.
That’s perfectly fine, but “souls don’t exist and thus consciousness and identity must function on top of a physical substrate” is very different from “the identity of a being is given by the abstract classical computation performed by a particular (and reified) subset of the brain’s electronic circuit,” and the latter has never been supported by compelling explanations or evidence. [1] This is despite the fact that the particular conclusions that have become part of the ethos of LW about topics like brain emulation, cryonics, etc. necessarily rely on the latter, not the former.
As a general matter, accepting physicalism as correct would naturally lead one to the conclusion that what runs on top of the physical substrate works on the basis of… what is physically there (which, to the best of our current understanding, can be represented through quantum-mechanical probability amplitudes), not on the conclusions you draw from a mathematical model that abstracts away quantum randomness in favor of a classical picture, the entire brain structure in favor of (a slightly augmented version of) its connectome, and the brain’s entire chemical make-up in favor of its electrical connections. As I have mentioned, that is a mere model representing a very lossy compression of what is going on; it is not the same as the real thing, and conflating the two is an error that has persisted here for far too long. Of course, it might well be the case that Rob and the computationalists are right about these issues, but the explanation up to now should make clear why the burden is on them to provide evidence for their conclusion.
More specifically, is a real-world being actually the same as the abstract computation its mind embodies? Rejections of souls and dualism, alongside arguments for physicalism, do not prove the computationalist thesis to be correct, as physicalism-without-computationalism is not only possible but also (as the very name implies) a priori far more faithful to the standard physicalist worldview.
As I have written before about these matters:

[...]
The feedback loops implicit in the structure of the brain cause reward and punishment signals to “release chemicals that induce the brain to rearrange itself” in a manner closely analogous to a continuous and (until death) never-ending micro-scale brain surgery. To be sure, barring serious brain trauma, these are typically small-scale changes, but they nevertheless fundamentally modify the connections in the brain and thus the computation it would produce in something like an emulated state (as a straightforward corollary: in what sense would an em that does not “update” its brain chemistry the way a biological being does remain “human” in any decision-relevant way?). We can think about a continuous personal identity through the lens of mutual information about memories, personalities, etc., but our current understanding of these topics is vastly incomplete and inadequate, and in any case the naive (yet very widespread, even on LW) interpretation of “the utility function is not up for grabs” as meaning that terminal values cannot be changed (or even make sense as a coherent concept) seems totally wrong.
[1] It is hard to prove a negative, but I make this claim on the basis of having read ~the entirety of what has been written on this site about personal identity and consciousness.
I differ from Rob in that I do think his piece should have flagged the assumption of ~computationalism, but I think the assumption is reasonable enough not to need arguing for in this piece.
I do think it is an interesting philosophical discussion to hash out, for the sake of rigor and really pushing for clarity. I’m sad that I don’t think I could dive in deep on the topic right now.
To answer your question in your other comment: I reckon that with some time I could write an explainer for why we should very reasonably assume consciousness is the result of local brain stuff and nothing else (and also not quantum stuff), though I’d be surprised if I could easily write something so rigorous that you’d find it fully satisfactory.
a fuller version would make the case for why we should wholly believe that experience is determined by local brain state rather than treat it as an assumption
Relatedly to my other comment, I’m curious whether you (Ruby) think you would be capable of writing such a version. I’m obviously not asking you to actually write it (you probably have much better things to do with your time), but I do wonder what the answer is; and if it is “no”, I would also want to ask whether you nonetheless think you have good reasons to believe that Rob’s conclusion is correct (in light of the counterarguments and reasons for skepticism that have been brought up, which seem to me like they would necessitate that exact “fuller version” to resolve).