I know it was the intention, but it doesn’t actually work the way you think.
The thing that causes the confusion is that you introduced an infallible decision maker into the brain that takes all autonomy away from the human (in the case of there being no forecaster). This is basically a logical impossibility, which is why I just said “this is Newcomb’s problem”. There has to be a forecaster. But okay, suppose not. I’ll show you why this does make a difference.
In Newcomb’s problem, you do in fact influence the contents of the opaque box. Your decision doesn’t, but the fact that you are the kind of person who makes this decision does. Your algorithm does. In the Alien Implant scenario with no forecaster, you don’t affect the state of your box at all.
If there were a forecaster, you could prevent people from dying of cancer by telling them about Timeless Decision Theory. Their choice not to smoke wouldn’t affect the state of their box, but the fact that you convince them would: the forecaster predicts that you will convince them, therefore they do not smoke, therefore it predicts that they don’t smoke, therefore the box is on state 2.
If there were no forecaster, whether or not someone smokes has no effect on their box, causally or otherwise. The state of their box is already determined; if you convinced them not to smoke, they would still get cancer and die, and the box would be on state 1. Now this never happens in your scenario, which, like I said, is pretty close to being impossible, hence the confusion.
But it doesn’t matter! Not smoking means you live, smoking means you die!
No, it doesn’t. Suppose the decision maker was infallible. Everyone who smokes dies. Sooner or later people would all stop smoking. And this is where the scenario doesn’t work anymore. Because the number of people dying can’t go down. So either it must be impossible to convince people – in that case, why try? – or the decision maker becomes fallible, in which case your whole argument breaks apart. You don’t smoke and still die.
Think about this fact again: no forecaster means there is a fixed percentage of the population who has their box on state 1. If you are still not convinced, consider that “winning” by not smoking would then have to mean that someone else gets cancer instead, since you cannot change the number of people. Obviously, this is not what happens.
If there was a forecaster and everyone stopped smoking, no-one would die. If everyone one-boxes in Newcomb’s problem, everyone gets rich.
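The no-forecaster case above can be made concrete with a toy simulation. This is a hypothetical sketch, not anything from the original scenario: the 10% dial rate and the population size are invented numbers, and it assumes a Super-Omega-style persuader who can override the dial-driven choice.

```python
import random

random.seed(0)

N = 100_000
# With no forecaster, the dial is fixed at birth and independently
# causes both the smoking choice and the cancer outcome.
dial_state1 = [random.random() < 0.10 for _ in range(N)]

smokes = list(dial_state1)       # the dial writes the choice...
gets_cancer = list(dial_state1)  # ...and separately fixes the disease

deaths_before = sum(gets_cancer)

# Outside intervention: everyone is persuaded not to smoke.
smokes = [False] * N

# The cancer outcomes are untouched: the exact same people die.
assert sum(gets_cancer) == deaths_before
```

Changing `smokes` after the fact changes nothing, which is the point being made here: with no forecaster there is no channel, causal or otherwise, from the choice back to the box.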
I’m not sure what you mean by “autonomy” here. The scientists guess that the device is reading or writing, but a third possibility is that it is doing both, and is a kind of brain-computer interface. In essence you might as well say it is part of the person: the human-black box combination has just as much autonomy as normal humans have.
“Suppose the decision maker was infallible. Everyone who smokes dies. Sooner or later people would all stop smoking. And this is where the scenario doesn’t work anymore. Because the number of people dying can’t go down. So either it must be impossible to convince people – in that case, why try? – or the decision maker becomes fallible, in which case your whole argument breaks apart. You don’t smoke and still die.”
In real life it does seem impossible to convince people; there are plenty of people who are stubborn two-boxers, and plenty of people stubbornly insisting on smoking in the smoking lesion, like you. So nothing in my experience rules out it being impossible to convince everyone because of the box. Nonetheless, if the box is writing people’s choices, that does not mean it will permanently be impossible to persuade people. It will be impossible to persuade people who already have the opposite written; but if we succeed in the future in persuading everyone, it will mean that everyone in the future had their dial set to the second position. Nothing says that the proportion of people in the population with the dial set one way or another can’t change; the settings may be beamed in by the aliens, and perhaps you are cooperating with them by trying to persuade people.
“Think about this fact again: no forecaster means there is a fixed percentage of the population who has their box on state 1.”
So what? The proportion of the population who will in real life die of cancer is fixed; everyone who is going to die of cancer is going to die of it, and everyone who isn’t, isn’t. That doesn’t mean the proportion can’t change in the future, either in the real-life case or in the box case.
“it will mean that everyone in the future had their dial set to the second position.”
No it won’t. Nothing you wrote into the story indicates that you can change the box (in case of no forecaster). If you could, that would change everything (and it wouldn’t be the smoking lesion anymore).
I don’t think you understood. Consider Newcomb’s problem by itself. Omega has already flown away. The million is either there, or it is not.
The only sense in which you can change whether the million is there is this: if you decide to take two boxes, you are basically deciding to have been a person who would take two boxes, and therefore deciding that Omega would not have put the million there. If you decide to take one box, you are basically deciding to have been a person who would take one box, and therefore deciding that Omega would have put the million there.
In my situation, it is the same: you can “determine” whether your dial is set to the first or second position by making a decision about whether to smoke.
Now consider the Omega situation above, except that after Omega has left, Super-Omega steps in, who cannot be predicted by Omega. Super-Omega changes your decision to the opposite of what it was going to be. If this happens, you can two-box and still get the million, or one-box and get nothing, depending on what your original decision was.
In my situation, it is the same: if someone can actually persuade a person to do the opposite of his dial setting, that persuader is basically like Super-Omega here. In other words, this would be exactly what you were talking about: the situation where convincing someone does not help.
What I was saying was this: in the Alien Implant world, the currently existing people have their dials set to the first or second position in a certain proportion. Let’s say that 90% of people have their dials set to the second position (so that most people don’t die of cancer), and 10% have their dials set to the first position. I agree that the story says their dials never change place. But new people are constantly being born, and nothing in the story says that the proportion among the new people cannot be different.
Assuming the non-existence of Super-Omegas, it is true that the proportion of people who choose to smoke will never be different from the proportion of people who have dials set to the first position. That does not mean that you cannot convince an individual not to smoke—it just means that the person you convince already has his dial set to the second position. And it also does not mean that the proportion cannot change, via the existence of new people.
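The point about new people can be sketched in a few lines. This is a hypothetical illustration with invented per-generation dial rates: every individual’s dial is fixed for life and matches their choice exactly, yet the population-wide proportion still moves as generations are replaced.

```python
import random

random.seed(1)

def new_generation(n: int, p_state1: float) -> list[bool]:
    """Each newborn's dial is set once, at birth, and never changes."""
    return [random.random() < p_state1 for _ in range(n)]

rates = []
for p in [0.10, 0.08, 0.05, 0.02]:  # whatever the aliens beam in
    dials = new_generation(50_000, p)
    smokers = dials  # no Super-Omegas: choices match dials exactly
    rates.append(sum(smokers) / len(smokers))

# The observed smoking/cancer rate drifts downward across generations
# even though no individual dial ever flips.
print(rates)
```

No dial changes place within any one lifetime, yet the proportion across the whole population still declines.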
Also, I forgot to remark on your claim that a non-forecasting box is “logically impossible”. Is this supposed to be logically impossible with a 51% correlation?
or a 52% correlation?
…
or a 98% correlation?
or a 99% correlation?
or a 99.999999% correlation?
I suppose you will say that it becomes logically impossible at an 87.636783% correlation, but I would like to see your argument for this.
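For what it’s worth, a box with any imperfect correlation is straightforward to model, which is why no particular threshold looks special. A hypothetical sketch; the `accuracy` knob and the 50/50 dial split are my own invented parameters, not part of the original story.

```python
import random

random.seed(2)

def person(accuracy: float) -> tuple[bool, bool]:
    """The dial is set first; the choice follows it with the given accuracy."""
    dial_state1 = random.random() < 0.5
    follows = random.random() < accuracy
    smokes = dial_state1 if follows else not dial_state1
    return dial_state1, smokes

# The same construction is coherent at 51%, 99%, or 99.999999%:
# nothing breaks until the correlation is exactly perfect.
for acc in [0.51, 0.52, 0.98, 0.99]:
    sample = [person(acc) for _ in range(200_000)]
    match_rate = sum(d == s for d, s in sample) / len(sample)
    print(acc, round(match_rate, 3))
```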
“In my situation, it is the same: you can ‘determine’ whether your dial is set to the first or second position by making a decision about whether to smoke.”
No.
You can not. You can’t.
I’m struggling with this reply. I almost decided to stop trying to convince you. I will try one more time, but I need you to consider the possibility that you are wrong before you continue to the next paragraph. Consider the outside view: if you were right, Yudkowsky would be wrong, Anna would be wrong, everyone who read your post here and didn’t upvote this revolutionary, shocking insight would be wrong. Are you sufficiently more intelligent than any of them to be confident in your conclusion? I’m saying this only so that you consider the possibility, nothing more.
You do not have an impact. The reason why you believe otherwise is probably that in Newcomb’s problem, you do have an impact in an unintuitive way, and you generalized this without fully understanding why you have an impact in Newcomb’s problem. It is not because you can magically choose to live in a certain world despite no causal connection.
In Newcomb’s problem, the kind of person you are causally determines the contents of the opaque box, and it causally determines your decision about which boxes to take. You have the option to change the kind of person you are, i.e. decide you’ll one-box in Newcomb’s problem at any given moment before you are confronted with it (such as right now), therefore you causally determine how much money you will receive once you play it in the future. The intuitive argument “it is already decided, therefore it doesn’t matter what I do” is actually 100% correct. Your choice to one-box or two-box has no influence on the contents of the opaque box. But the fact that you are the kind of person who one-boxes does, and it happens to be that you (supposedly) can’t two-box without being the kind of person who two-boxes.
In the Smoking Lesion, in your alien scenario, this impact is not there. An independent source determines both the state of your box and your decision to smoke or not to smoke. A snapshot of all humans at any given time, with no forecasting ability, reveals exactly who will die of cancer and who won’t. If Super-Omega comes from the sky and convinces everyone to stop smoking, the exact same people will die as before. If everyone stopped smoking immediately, the exact same people would die as before. In the future, the exact same people who would otherwise have died still die. People with the box in the wrong state who decide to stop smoking still die.
“Consider the outside view: if you were right, Yudkowsky would be wrong, Anna would be wrong, everyone who read your post here and didn’t upvote this revolutionary, shocking insight would be wrong. Are you sufficiently more intelligent than any of them to be confident in your conclusion?”
This outside view is too limited; there are plenty of extremely intelligent people outside Less Wrong circles who agree with me. This is why I said from the beginning that the common view here came from the desire to agree with Eliezer. Notice that no one would agree and upvote without first having to disagree with all those others, and they are unlikely to do that because they have the limited outside view you mention here: they would not trust themselves to agree with me, even if it was objectively convincing.
Scott Alexander is probably one of the most unbiased people ever to be involved with Less Wrong. Look at this comment:
“But keeping the original premise that it’s known that out of everyone who’s ever lived in all of history, every single virtuous Calvinist has ended up in Heaven and every single sinful Calvinist has ended up damned—I still choose to be a virtuous Calvinist. And if the decision theorists don’t like that, they can go to hell.”
Likewise, if they don’t like not smoking in the situation here, they can die of cancer.
“You have the option to change the kind of person you are, i.e. decide you’ll one-box in Newcomb’s problem at any given moment before you are confronted with it (such as right now), therefore you causally determine how much money you will receive once you play it in the future.”
If I am not the kind of person who would accept this reasoning, I can no more make myself into the kind of person who would accept this reasoning (even right now), than I can make myself into a person who has the dial set to the second position. Both are facts about the world: whether you have the dial set in a certain position, and whether you are the kind of person who could accept that reasoning.
And on the other hand, I can accept the reasoning, and I can choose not to smoke: I will equally be the kind of person who takes one box, and I will be a person who would have the dial in the second position.