Using Newcomb’s problem as an example, it seems like it glosses over important details of how much evidence you would actually need to believe in an Omega-like entity, and as a result it confuses more than it illuminates. Re-reading some of Eliezer’s posts on it, I get the impression that he is hinting that his resolution of the issue is connected to that problem. It seems to me that it causes a lot of unnecessary confusion because humans are susceptible to stories that require suspension of disbelief in highly implausible occurrences, disbelief they would not actually suspend if they encountered those occurrences in real life. This might be an example of Robin Hanson’s near/far distinction.
Tyler Cowen’s cautionary tale about the dangers of stories covers some of the same kinds of human biases that I think are triggered by implausible hypotheticals.
Using Newcomb’s problem as an example, it seems like it glosses over important details of how much evidence you would actually need to believe in an Omega-like entity, and as a result it confuses more than it illuminates.
It certainly does gloss over that… I mean, it has to; you’d require a lot of evidence. But the reason it does so is that the question isn’t whether Omega could exist or how we can tell when Omega shows up… the details are buried because they aren’t relevant. How does Newcomb’s problem confuse more than it illuminates? It illustrates a problem/paradox. We would not be aware of that paradox were it not for the hypothetical. I suppose it confuses in the sense that one becomes aware of a problem they weren’t previously aware of, but that’s the kind of confusion we want.
Tyler Cowen’s cautionary tale about the dangers of stories covers some of the same kinds of human biases that I think are triggered by implausible hypotheticals.
It’s a great video and I’m grateful you linked me to it, but I don’t see where the problems with the kind of stories Cowen was discussing show up in thought experiments.
How does Newcomb’s problem confuse more than it illuminates? It illustrates a problem/paradox. We would not be aware of that paradox were it not for the hypothetical.
The danger is that you can use a hypothetical to illustrate a paradox that isn’t really a paradox, because its preconditions are impossible. A famous example: Suppose you’re driving a car at the speed of light, and you turn on the headlights. What do you see?

This is a danger. Good point.
How does Newcomb’s problem confuse more than it illuminates? It illustrates a problem/paradox.
It confuses because it doesn’t really show a problem/paradox. That is not obvious because of the peculiar construction of the hypothetical. If you actually had enough evidence to make it seem like one-boxing was the obvious choice, then it wouldn’t seem like a paradoxical choice. The problem is that people generally aren’t able to imagine themselves into such a scenario, and so they think they should two-box and then think there is a paradox (because you ‘should’ one-box). They quite reasonably aren’t able to imagine themselves into such a scenario because it is wildly implausible. The paradox is just an artifact of the difficulty we have in mentally dealing with highly implausible scenarios.
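To put a rough number on ‘wildly implausible’: a minimal sketch, assuming a tiny prior for a genuinely Omega-like predictor and a 50% chance-level baseline for guessing a binary choice by luck (both figures are illustrative assumptions, not anything established in this thread):

```python
import math

# Illustrative assumptions only: a very small prior probability that a
# genuinely Omega-like predictor exists, and a 50% baseline chance of
# "predicting" a binary choice by luck.
prior_odds = 1e-20       # prior odds in favour of the Omega hypothesis
likelihood_ratio = 2.0   # each correct call: P(correct | Omega) = 1
                         # versus P(correct | lucky guess) = 0.5

# Consecutive correct predictions needed before the posterior odds
# favour the Omega hypothesis over lucky guessing.
n = math.ceil(math.log(1 / prior_odds, likelihood_ratio))
print(n)  # 67: dozens of flawless demonstrations, not one or two
```

Under those made-up numbers, you would need on the order of sixty-seven consecutive flawless predictions before believing in Omega becomes the better bet, which gives some sense of how far outside ordinary experience the scenario sits.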
I don’t see where the problems with the kind of stories Cowen was discussing show up in thought experiments.
Specifically, what I had in mind was the fact that people seem to have a natural willingness to suspend disbelief and accept contradictory or wildly implausible premises when ‘story mode’ is activated. We are used to listening to stories, and we become less critical of logical inconsistencies and unlikely scenarios because they are a staple of stories. Presenting a thought experiment in the form of a story containing a highly implausible scenario takes advantage of a weakness in our mental defenses that exists for story-shaped language, and it leads to confusion and misjudgement that we would not exhibit if confronted with a real situation rather than a story.
If you actually had enough evidence to make it seem like one-boxing was the obvious choice, then it wouldn’t seem like a paradoxical choice. The problem is that people generally aren’t able to imagine themselves into such a scenario, and so they think they should two-box and then think there is a paradox (because you ‘should’ one-box).
No. The choice is paradoxical because no matter how much evidence you have of Omega’s omniscience, the choice you make can’t change the amount of money in the box. As such, traditional decision theory tells you to two-box, because the decision you make can’t affect the amount of money in the boxes. No matter how much money is in the boxes, you come out ahead by two-boxing. Most educated people are causal decision makers by default, so a thought experiment where causal decision makers lose is paradox-inducing. Even if one-boxing were the obvious choice, people would still feel the need to posit new decision theories as a result, since causal decision theory recommends two-boxing.
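To make the dominance argument concrete, here is a minimal sketch using the conventional payoffs from the literature ($1,000 in the transparent box, $1,000,000 in the opaque one) and an assumed 99% predictor accuracy; none of these figures appear in the comment above:

```python
SMALL = 1_000        # transparent box: always contains $1,000
BIG = 1_000_000      # opaque box: full only if Omega predicted one-boxing
ACCURACY = 0.99      # assumed reliability of Omega's prediction

# Causal (dominance) reasoning: the boxes are already filled, so check
# each possible state; two-boxing wins in both.
for big_box_full in (True, False):
    one_box = BIG if big_box_full else 0
    two_box = one_box + SMALL
    print(f"opaque box full={big_box_full}: "
          f"one-box ${one_box:,}, two-box ${two_box:,}")

# Evidential reasoning: treat the choice as evidence about what Omega
# predicted, and the expected values diverge sharply.
ev_one_box = ACCURACY * BIG                  # $990,000
ev_two_box = (1 - ACCURACY) * BIG + SMALL    # $11,000
print(f"E[one-box] = ${ev_one_box:,.0f}, E[two-box] = ${ev_two_box:,.0f}")
```

The clash between those two calculations, rather than any doubt about the evidence, is the thing the new decision theories were posited to resolve.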
I disagree, and I think this is what Eliezer is hinting towards, now that I’ve gone back and re-read Newcomb’s Problem and Regret of Rationality. If you really had sufficient evidence to believe that Omega is either an omniscient mind reader or some kind of acausal agent such that it makes sense to one-box, then it makes sense to one-box. It only looks like a paradox because you’re failing to imagine having that much evidence. Which, incidentally, is not really a problem: an inability to imagine highly implausible scenarios in detail is not generally an actual handicap in real-world decision making.
I’m still going to two-box if Omega appears tomorrow, though, because there are many more likely explanations for the series of events depicted in the story than the one you are supposed to take as given.