I agree with your post, especially since I expect to win my bet with Eliezer.
Did you notice that, as phrased in the link, your bet is about the following event: “[at a certain point in time under a few conditions] it will be interesting to hear Eliezer’s excuses”? Technically, all Eliezer would have to do to win the bet is write a boring excuse.
Eliezer was the one who linked to that: the bet is about whether those conditions will be satisfied.
Anyway, he has already promised (more or less) not to make excuses if I win.
I don’t know what this bet is, and I don’t see a link anywhere in your post.
http://wiki.lesswrong.com/wiki/Bets_registry
(I am the original Unknown but I had to change my name when we moved from Overcoming Bias to Less Wrong because I don’t know how to access the other account.)
Any chance you and Eliezer could set a date on your bet? I’d like to import the 3 open bets to Prediction Book, but I need a specific date. (PB, rightly, doesn’t do open-ended predictions.)
E.g., perhaps 2100, well after many Singularitarians expect some sort of AI, and also well after both of your actuarial death dates.
If we agreed on that date, what would happen in the event that there was no AI by that time and both of us were still alive? (These conditions are surely very unlikely, but there has to be some determinate answer anyway.)
You could:

1. donate the money to charity under the view ‘and you’re both wrong, so there!’
2. say that the prediction is implicitly a big AND — ‘there will be an AI by 2100 AND said first AI will not have… etc.’ — and that the conditions allow ‘short-circuiting’ when any AI is created; with this change, reaching 2100 is a loss on your part.
3. like #2, but the loss is on Eliezer’s part (the bet changes to ‘I think there won’t be an AI by 2100, but if there is, it won’t be Friendly, etc.’)
I like #2 better since I dislike implicit premises, and this (while you two are still relatively young and healthy) is as good a time as any to clarify the terms. But #1 follows the Long Bets formula more closely.
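For concreteness, here is a minimal sketch (in Python, with hypothetical names throughout) of how the three resolution rules would differ. It assumes that, once any AI is created, the bet's original conditions are what decide the winner, and that those conditions are taken to favor Unknowns when they hold:

```python
from enum import Enum

class Outcome(Enum):
    UNKNOWNS_WINS = "Unknowns wins"
    ELIEZER_WINS = "Eliezer wins"
    CHARITY = "stakes donated to charity"

def resolve(option: int, ai_created: bool, conditions_hold: bool) -> Outcome:
    """Resolve the bet under one of the three proposed rules.

    ai_created:      whether any AI exists by the 2100 deadline
    conditions_hold: whether the bet's original conditions about that
                     first AI are satisfied (assumed here to decide
                     in Unknowns' favor when true)
    """
    if ai_created:
        # All three options 'short-circuit' as soon as any AI is
        # created: the original conditions decide the winner.
        return Outcome.UNKNOWNS_WINS if conditions_hold else Outcome.ELIEZER_WINS
    if option == 1:
        return Outcome.CHARITY        # 'you're both wrong, so there!'
    if option == 2:
        return Outcome.ELIEZER_WINS   # reaching 2100 is a loss on Unknowns' part
    return Outcome.UNKNOWNS_WINS      # option 3: the loss is on Eliezer's part
```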
Eliezer and I are probably about equally confident that “there will not be an AI by 2100, and both Eliezer and Unknown will still be alive” is incorrect, so it doesn’t seem fair to select either #2 or #3. Option #1 seems better.