Everything you say after the “No.” is true but doesn’t support your contradiction of:
I can’t think of one single case in my experience when the argument “It has a small probability of success, but we should pursue it, because the probability if we don’t try is zero” turned out to be a good idea.
Er … isn’t that the argument for cryonics?
There is no need to defend cryonics here. Just relax the generalisation. I’m surprised you ‘can’t think of a single case in your experience’ anyway. It took me 10 seconds to think of three in mine. Hardly surprising—such cases turn up whenever the payoffs multiply out right.
I think the kind of small probabilities Eliezer was talking about here (not that he was specific) are small in the sense that there is a small probability that evolution is wrong, a small probability that God exists, etc.
The other interpretation is something like: there is a small probability you will hit your open-ended straight draw (about 31%). If there are at least two players other than you calling, though, it is always a good idea to call (excepting tournament and all-in considerations). So it depends on what interpretation you have of the word ‘small’.
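For concreteness, here is a minimal sketch of that arithmetic in Python, assuming the simplest case where a bettor and two other callers have each put one bet into the pot before you act (the actual pot would usually be larger, which only strengthens the call):

```python
from math import comb

# Open-ended straight draw: 8 outs, 47 unseen cards after the flop,
# two cards to come.
outs, unseen = 8, 47
p_hit = 1 - comb(unseen - outs, 2) / comb(unseen, 2)
print(f"P(hit by river) = {p_hit:.1%}")  # ~31.5%, the ~31% figure above

# Break-even win probability for calling a bet: bet / (pot + bet).
# With a bettor and two other callers, the pot holds at least 3 bets,
# so you only need to win more than 25% of the time.
def breakeven(bet, pot):
    return bet / (pot + bet)

print(f"Need > {breakeven(1, 3):.0%} to call")  # 25% < 31.5%, so call
```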
By the first definition of small (vanishing), I can’t think of a single such argument that turned out to be a good idea. By the second, I can think of thousands. So the generalisation is leaky because of that word ‘small’. Instead of relaxing it, just tighten up the ‘small’ part.
Redefinition not supported by the context.
I already noted that Eliezer was not specific enough to support that redefinition. I was offering an alternate course of action for Eliezer to take.
That would certainly be a more reasonable position. (Except, obviously, where the payoffs were commensurately large. That doesn’t happen often. Situations like “3 weeks to live, can’t afford cryonics” are the only kind of exception that springs to mind.)
Name one? We might be thinking of different generalizations here.
Almost certainly. I am specifically referring to the generalisation quoted by David. It is, in fact, exactly the reasoning I used when I donated to the SIAI. Specifically, I estimate the probability of me or even humanity surviving for the long term if we don’t pull off FAI to be vanishingly small (like that of winning the lottery by mistake, without buying a ticket), so I donated to support FAI research even though I think it to be, well, “impossible”.
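The shape of that reasoning is just expected value with a tiny probability and an enormous payoff. A minimal sketch, where every number is a hypothetical placeholder chosen purely for illustration (nothing here is anyone’s actual estimate):

```python
# All figures are hypothetical placeholders, for illustration only.
p_success = 1e-3   # small but non-zero chance the effort works
p_without = 0.0    # "the probability if we don't try is zero"
payoff = 1e12      # astronomically large payoff, arbitrary units
cost = 1e3         # cost of trying

ev_try = p_success * payoff - cost   # expected value of trying
ev_dont = p_without * payoff         # expected value of not trying
print(ev_try > ev_dont)  # True: a vanishing probability can still dominate
```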
More straightforward examples crop up all the time when playing games. Just last week I bid open misere when I had a 10% chance of winning; the alternatives, passing or making a nine call, were both guaranteed to lose the game of 500.
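In terms of winning the overall game (rather than points), the choice reduces to picking the bid with the highest win probability. A sketch, where the only real figure is the 10% quoted above and the zeros follow from “guaranteed losses”:

```python
# Probability of winning the game of 500 under each available bid.
# The 10% is the estimate from the comment; the zeros follow from
# the other lines being guaranteed losses.
p_game_win = {"pass": 0.0, "nine call": 0.0, "open misere": 0.10}
best = max(p_game_win, key=p_game_win.get)
print(best)  # 'open misere': any nonzero chance beats a certain loss
```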