This is a self-deception technique. If you think it’s morally OK to deceive your future self for your present selfish ends, then by all means go ahead. Note also that violent means of precommitment should arguably be considered immoral, on a par with forcing another person to do your bidding by hiring a killer to kill them if they don’t comply.
In Newcomb’s problem, it actually is in your self-interest to one-box. Not so in this problem.
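The one-boxing claim can be made concrete with a small expected-value sketch. This assumes the standard payoffs (which the comment does not state): $1,000,000 in the opaque box, $1,000 in the transparent one, and a predictor with accuracy `p`.

```python
def expected_values(p, big=1_000_000, small=1_000):
    """Expected payoffs in Newcomb's problem for predictor accuracy p,
    under the standard payoffs (an assumption, not from the thread)."""
    # One-box: with probability p the predictor foresaw it and filled the big box.
    one_box = p * big
    # Two-box: with probability p the predictor foresaw it and left the big box empty;
    # otherwise you collect both boxes.
    two_box = p * small + (1 - p) * (big + small)
    return one_box, two_box

one, two = expected_values(0.9)
print(one, two)  # roughly 900000 vs 101000: one-boxing dominates
```

Solving `one_box = two_box` gives a crossover near p ≈ 0.5005, so even a barely-better-than-chance predictor makes one-boxing the higher-expectation choice under these payoffs.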
I am fairly sure that it isn’t, but demonstrating this would require another maths-laden article, which I anticipate would be received much as my last one was. I will, however, email you my full reasoning if you wish (you will have to wait several days while I brush up on the logical concept of common knowledge). (I don’t know how to encode a ) in a link, so please add one to the end.)
I’m going to write up my new position on this topic. Nonetheless, I think it should be possible to discuss the question in a more concise form, since I think the problem is one of communication, not rigor. You deceive your future self; that is the whole point of the comment above: you make it believe that it wants to take an action that it actually doesn’t. The only counterposition I expect is the claim that, no, the future self actually does want to take that action.
I think the problem with your article wasn’t that it was math-laden, but that you didn’t introduce things in enough detail for readers to follow along and to see the motivation behind the math.
To be perfectly honest, your last sentence matches my own feeling. I should at the very least have talked more about the key equation. But the article was already long, I was unsure how it would be received, and I spent too little time revising it (a persistent problem for me). If I were to write it again now, it would be closer in style to the thread between you and me there.
If you intend to write another post, then I am happy to wait until then to introduce the ideas I have in mind, and I will try hard to do so in a manner that won’t alienate everyone.
Common knowledge (I used %29, the percent-encoded ASCII code for “)”).
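For anyone wanting to do this programmatically rather than by hand, Python’s standard library can percent-encode reserved characters such as parentheses. The Wikipedia URL below is only an assumed example target for the “common knowledge” link; the thread does not give the actual URL.

```python
from urllib.parse import quote

# Hypothetical link target; parentheses in the path break markdown-style links.
url = "https://en.wikipedia.org/wiki/Common_knowledge_(logic)"

# Keep ":" and "/" literal, but percent-encode everything else reserved,
# turning "(" into %28 and ")" into %29.
encoded = quote(url, safe=":/")
print(encoded)  # ...Common_knowledge_%28logic%29
```

The same trick works for any character that a link syntax treats specially; `%29` is simply the hexadecimal ASCII code of “)” prefixed with `%`.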