In my situation, it is the same: you can “determine” whether your dial is set to the first or second position by making a decision about whether to smoke.
No.
You cannot. You can’t.
I’m struggling with this reply. I almost decided to stop trying to convince you. I will try one more time, but I need you to consider the possibility that you are wrong before you continue to the next paragraph. Consider the outside view: if you were right, Yudkowsky would be wrong, Anna would be wrong, and everyone who read your post here and didn’t upvote this revolutionary, shocking insight would be wrong. Are you sufficiently more intelligent than all of them to be confident in your conclusion? I’m saying this only so that you consider the possibility, nothing more.
You do not have an impact. The reason why you believe otherwise is probably that in Newcomb’s problem, you do have an impact in an unintuitive way, and you generalized this without fully understanding why you have an impact in Newcomb’s problem. It is not because you can magically choose to live in a certain world despite no causal connection.
In Newcomb’s problem, the kind of person you are causally determines the contents of the opaque box, and it causally determines your decision about which boxes to take. You have the option to change the kind of person you are, i.e. decide you’ll one-box in Newcomb’s problem at any given moment before you are confronted with it (such as right now), and therefore you causally determine how much money you will receive once you play it in the future. The intuitive argument “it is already decided, therefore it doesn’t matter what I do” is actually 100% correct. Your choice to one-box or two-box has no influence on the contents of the opaque box. But the fact that you are the kind of person who one-boxes does, and it happens to be that you (supposedly) can’t two-box without being the kind of person who two-boxes.
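To make the causal structure concrete, here is a minimal toy model (my own illustrative sketch, not part of the original argument; the function name and payoffs are assumptions): the agent’s disposition fixes both the prediction, and hence the opaque box’s contents, and the later choice itself. Changing the disposition is the only lever, and it changes the payoff.

```python
def newcomb_payoff(disposition):
    """Toy Newcomb model: the disposition ('one-box' or 'two-box')
    causally determines BOTH the prediction (so the opaque box's
    contents) and the choice eventually made."""
    # Omega fills the opaque box based on the disposition it predicts.
    opaque = 1_000_000 if disposition == "one-box" else 0
    # The same fact about you later produces the choice itself.
    choice = disposition
    if choice == "one-box":
        return opaque
    # Two-boxers also take the transparent $1,000.
    return opaque + 1_000

# Changing the kind of person you are (before the boxes are filled)
# changes the payoff, even though the choice itself never alters the box.
print(newcomb_payoff("one-box"))   # 1000000
print(newcomb_payoff("two-box"))   # 1000
```

The point of the sketch is that the choice node never writes back into the box; only the upstream disposition does, which is why deciding now to be a one-boxer is causally effective.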
In the Smoking Lesion, in your alien scenario, this impact is not there. An independent source determines both the state of your box and your decision to smoke or not to smoke. A snapshot of all humans at any given time, with no forecasting ability, reveals exactly who will die of cancer and who won’t. If superomega came from the sky and convinced everyone to stop smoking, the exact same people would die as before. If everyone stopped smoking immediately, the exact same people would die as before. In the future, the exact same people who would otherwise have died still die. People with the box in the wrong state who decide to stop smoking still die.
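The contrast with Newcomb’s problem can be sketched the same way (again a hypothetical toy model of my own; the probabilities and names are assumptions): an independent common cause fixes both the cancer outcome and the default urge to smoke, and intervening on the decision has no path back to that cause, so the death count is unchanged.

```python
import random

def simulate(n, intervention=None, seed=0):
    """Toy Smoking Lesion model: a common cause (the 'dial'/'box')
    fixes both cancer and the default decision to smoke.
    `intervention` (True/False), if given, overrides everyone's
    decision; it has no causal path back to the dial."""
    rng = random.Random(seed)
    deaths = 0
    for _ in range(n):
        dial_bad = rng.random() < 0.3   # independent common cause
        smokes = dial_bad               # default decision tracks the dial
        if intervention is not None:
            smokes = intervention       # superomega overrides the decision...
        if dial_bad:                    # ...but death depends only on the dial,
            deaths += 1                 # never on `smokes`
    return deaths

# Everyone quitting does not change who dies.
print(simulate(10_000) == simulate(10_000, intervention=False))  # True
```

With the same seed, the same people have the dial in the bad state whether or not anyone smokes, which is exactly the claim above: the decision is downstream of the common cause and screens off nothing.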
“Consider the outside view: if you were right, Yudkowsky would be wrong, Anna would be wrong, everyone who read your post here and didn’t upvote this revolutionary, shocking insight would be wrong. Are you sufficiently more intelligent than any of them to be confident in your conclusion?”
This outside view is too limited; there are plenty of extremely intelligent people outside Less Wrong circles who agree with me. This is why I said from the beginning that the common view here came from the desire to agree with Eliezer. Notice that no one would agree and upvote without first having to disagree with all those others, and they are unlikely to do that because they have the limited outside view you mention here: they would not trust themselves to agree with me, even if it was objectively convincing.
Scott Alexander is probably one of the most unbiased people ever to be involved with Less Wrong. Look at this comment:
But keeping the original premise that it’s known that out of everyone who’s ever lived in all of history, every single virtuous Calvinist has ended up in Heaven and every single sinful Calvinist has ended up damned—I still choose to be a virtuous Calvinist. And if the decision theorists don’t like that, they can go to hell.
Likewise, if they don’t like not smoking in the situation here, they can die of cancer.
“You have the option to change the kind of person you are, i.e. decide you’ll one-box in Newcomb’s problem at any given moment before you are confronted with it (such as right now), therefore you causally determine how much money you will receive once you play it in the future.”
If I am not the kind of person who would accept this reasoning, I can no more make myself into the kind of person who would accept this reasoning (even right now), than I can make myself into a person who has the dial set to the second position. Both are facts about the world: whether you have the dial set in a certain position, and whether you are the kind of person who could accept that reasoning.
And on the other hand, I can accept the reasoning, and I can choose not to smoke: I will equally be the kind of person who takes one box, and the kind of person who has the dial in the second position.