I understand the point Eliezer’s trying to make here. However, you (whoever’s reading this) could not convince me that ss0 + ss0 = sss0 in Peano arithmetic (I define the scenario in which my mind is directly manipulated so that I happen to believe this not to constitute “convincing me”). Here’s why I believe this position to be rational:
A) In order for me to make this argument, I have to presume communication of it. It’s not that I believe the probability of that communication to be 1. Certainly many people might read this comment and not know Peano arithmetic, misunderstand my language, not finish reading, and so on, and the probability of this is nontrivial. However, arguments are directed at the possible worlds in which they are understood.
B) Communication of “ss0 + ss0 =” as a statement of Peano arithmetic already fully constrains the answer to be “ssss0” simply by virtue of what these symbols mean. That is to say, having understood these symbols and Peano arithmetic, no further experience is necessary to know that “sss0” is wrong. Mental flaws at any point in this process are possible, but they exist only within possible worlds in which communication of these ideas does not actually occur, because to think that “ss0 + ss0 = sss0” is to misunderstand Peano arithmetic, and understanding Peano arithmetic is a prerequisite for understanding a claim about it.
Therefore
C) There is no possible world in which I can be convinced of the properly communicated claim “ss0 + ss0 = sss0”. Of course, this doesn’t mean there’s no possible world in which I can be convinced that I am experiencing a neurological fault or being manipulated, or that there are no possible worlds in which I happen to wrongly believe that ss0 + ss0 = sss0. It’s just that someone experiencing a neurological fault or being manipulated is not the same thing as someone being convinced.
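The constraint in (B) can be made concrete. As a minimal sketch (my own illustration, not part of the original comment), here is Peano addition on successor numerals written as strings; the two recursion equations alone force the result, with no room for “sss0”:

```python
# Peano numerals as strings of "s"s ending in "0" (so "ss0" is 2),
# with addition defined by the two Peano recursion equations:
#   n + 0    = n
#   n + s(m) = s(n + m)

def succ(n: str) -> str:
    """Successor: prepend one 's'."""
    return "s" + n

def add(n: str, m: str) -> str:
    """Peano addition, recursing on the second argument."""
    if m == "0":                    # n + 0 = n
        return n
    return succ(add(n, m[1:]))      # n + s(m') = s(n + m')

print(add("ss0", "ss0"))  # -> ssss0, forced by the definitions alone
```

Nothing here consults experience: once the symbols and the two equations are fixed, every step of the computation is determined.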
A similar argument holds for the impossibility of me convincing myself that ss0 + ss0 = sss0. I understand ss0 + ss0 = ssss0 in Peano arithmetic well enough that I can review in a very short period of time why it must be so. Thus you would literally have to make me forget that I know this in order to have me believe otherwise, which hardly counts as “convincing.” This does not mean that I am presuming mental errors or Dark Lords of the Matrix to be impossible. For clarification, here’s what a run-through of me experiencing what Eliezer proposes would look like:
I got up one morning, took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, without any earplugs having appeared or disappeared—in contrast to my stored memory that 2 + 2 was supposed to equal 4.
Because that stored memory entails an understanding of why, I run through those reasons. If they’re incomplete, this constitutes me “forgetting that I know this.” (It does not mean that I don’t know this now. Right now I do.) Therefore I don’t have a “stored memory that 2 + 2 was supposed to equal 4”; I have an incomplete stored memory which tries to say something about 2, +, =, and 4 (if my personality were intact, I would probably try to re-derive the missing parts of it, after calling 911). Either way, I identify a cognitive fault. In real life, waking up to this, my most likely suspect would be that my experience of one earplug disappearing was deleted before I processed it, but there are lots of other possibilities as well. If I repeated the experiment multiple times, I would consider either a systematic fault or “being messed with” at a fundamental level.
When I visualize the process in my own mind, it seems that making XX and XX come out to XXXX requires an extra X to appear from nowhere.
Still presuming an intact line of reasoning saying why this must not be so, I would again identify a cognitive fault, and a pretty cool one at that. Something this intricate might well leave me suspecting Dark Lords of the Matrix as a nontrivial possibility, provided all other cognitive functions seemed fully intact. Still wouldn’t be as likely as a weird brain fault, though. I would definitely have fun investigating this.
I check a pocket calculator, Google, and my copy of 1984 where Winston writes that “Freedom is the freedom to say two plus two equals three.”
Dark Lords of the Matrix bump higher, but psychosis has definitely leapt to the front of the pack.
I could keep going, of course. These last few presume I can still reason out something like Peano arithmetic. If I can’t, incidentally, then of course they look different, but I still don’t think it would be accurate to describe any possible outcome as “me being convinced that 2 + 2 = 3.” If you run all the way down the list until you literally delete all things that I know and all ways I might obtain them, I would describe that as a possible universe in which “me” has been deleted. The strict lower bound on where I can still stumble across my cognitive fault and/or manipulation is when my reasoning ability is no longer Turing complete. This essentially requires the elimination of all complex thought, though of course making it merely unlikely for me to stumble upon the fault is much easier—just delete everything I know about formal mathematics, for example.
tl;dr
I agree with most of what Eliezer is saying, but wouldn’t say that I could be convinced 2 + 2 = 3. Does this make my belief unconditional? Dependent on me understanding what 2 + 2 = 3 means, maybe it does. Maybe an understanding of 2, +, =, 3, and 4 necessitates 2 + 2 = 4 for a rational mind, and any deviation from this, even in internal mental processes, would be identifiable as a fault. After all, you can run a software program to detect flaws in some computer processors.
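In that spirit, here is a toy analogue of such a self-check (hypothetical, not any real processor diagnostic): test the system’s arithmetic against identities that any correct implementation must satisfy, so that a deviation registers as a fault rather than as a new mathematical truth:

```python
# Toy arithmetic self-test: probe random values against identities that
# must hold for correct addition. A failure would indicate a fault in the
# machinery doing the arithmetic, not a change in arithmetic itself.
import random

def arithmetic_self_test(trials: int = 1000) -> bool:
    for _ in range(trials):
        a = random.randrange(10**6)
        b = random.randrange(10**6)
        if a + b != b + a:        # commutativity
            return False
        if (a + b) - b != a:      # subtraction inverts addition
            return False
        if a + 0 != a:            # 0 is the additive identity
            return False
    return True

print(arithmetic_self_test())  # -> True on a correctly functioning system
```

The design choice mirrors the comment’s point: the checker treats the identities as fixed and interprets any mismatch as evidence about the hardware, never about the identities.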
Extrapolating from Eliezer’s line of reasoning, you would probably find that although you remember ss0 + ss0 = ssss0, if you try to derive ss0 + ss0 from the Peano axioms, you also discover it ends up as sss0, and starting with ss0 + ss0 = ssss0 quickly leads you to a contradiction.
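The contradiction mentioned above can be sketched mechanically (my illustration, assuming standard Peano axioms): if ss0 + ss0 equalled both ssss0 and sss0, then cancelling matching successors from each side—licensed because s is injective—leaves s0 = 0, which the axiom s(n) ≠ 0 forbids:

```python
# Cancel matching leading "s"es from both sides of an equation between
# Peano numerals. By injectivity of s, s(a) = s(b) implies a = b, so
# repeated cancellation preserves the (claimed) equality.

def strip_successors(a: str, b: str):
    """Repeatedly apply injectivity of s to both sides of a = b."""
    while a.startswith("s") and b.startswith("s"):
        a, b = a[1:], b[1:]
    return a, b

lhs, rhs = strip_successors("ssss0", "sss0")
print(lhs, rhs)  # -> s0 0
# One side is 0 and the other is a successor, so the original equation
# would entail s0 = 0, contradicting the axiom that s(n) != 0.
```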