No, it couldn’t.
There are multiple claims in the book that Harry will destroy the world. It starts in the first chapter with “The world will end”. Interestingly, that wasn’t there at the time the chapter was first published, but was added retrospectively.
Creating an AI in the world is just a matter of creating a magical item. Harry knows how to make them self-aware. Harry knows that magical creatures like trolls constantly self-modify through magic. Harry is into inventing powerful new spells.
All the pieces for building an AGI that goes FOOM are there in the book.
I assign 2% probability to this scenario. What probability do you assign?
Given that the pieces were there the last time I read it, p=.99 for that claim.
The more interesting claim is that an AGI actually goes FOOM. I say p=.65.
Yeah, that was the claim I meant.
Would you be willing to bet on this? I’d be willing to bet 2 of my dollars against 1 of yours that no AGI will go FOOM in the remainder of the HPMoR story (for a maximum of 200 of my dollars vs. 100 of yours).
Even in early 2012, I didn’t think 2:1 was the odds for an AGI fooming in MoR...
How would you like to bet 1 of your dollars against 3 of my dollars that an AGI will go FOOM? Up to a max of 120 of my dollars and 40 of yours; i.e., if an AGI goes FOOM, I pay you $120, and if it doesn’t, you pay me $40. (Payment through PayPal.) Given your expressed odds, this should look like a good deal to you.
I said I assign 2% probability to an AGI going FOOM in the story. So how would this be a good deal for me?
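As a quick check on why this offer reads so differently to the two sides, here is a minimal Python sketch of the expected value of accepting the $120-vs-$40 bet under each commenter’s stated probability; the function name and framing are illustrative, and only the payoffs and the 2%/65% figures come from the thread.

```python
# Rough expected-value check of the offer above, assuming the payoffs as
# stated: the taker receives $120 if an AGI goes FOOM in the story and
# pays $40 if it does not. (The function name is illustrative.)

def taker_ev(p_foom, win=120, lose=40):
    """Expected gain for the side that wins `win` on FOOM and pays `lose` otherwise."""
    return p_foom * win - (1 - p_foom) * lose

print(taker_ev(0.02))  # about -36.8 -> a clearly bad deal at the stated 2%
print(taker_ev(0.65))  # about  64.0 -> a clearly good deal at the stated 65%
```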
The odds I offered to ChristianKl were meant to express a middle ground between the odds I expressed (2%) and the odds he expressed (65%), so that the bet would seem about equally profitable to both of us, given our stated probabilities.
Bah! Fine then, we won’t bet. IMO, you should have offered more generous terms. If your true probability is 2%, then that’s an odds against of 1:49, while his 65% would be 1:0.53, if I’m cranking the formula right. So a 1:2 doesn’t seem like a true split.
You are probably right that it’s not a true split; I just did a stupid “add and divide by 2” on the percentages, but it doesn’t really work like that. He would anticipate losing once every 3 times, but given my percentages I anticipated losing once every 50 times. (I’m not very mathy at all.)
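For readers who want to verify the odds arithmetic in this exchange, here is a minimal Python sketch that converts the two stated probabilities into odds and reproduces the loss frequencies mentioned above; the helper function and the geometric-mean “compromise” at the end are illustrative assumptions, not anything either commenter stated.

```python
# Minimal sketch verifying the odds arithmetic quoted above.
# Only the probabilities (2% and 65%) and the 2:1 stakes come from the thread;
# the helper and the geometric-mean compromise below are illustrative.

def odds_for(p):
    """Convert a probability into odds in favor, e.g. 0.65 -> about 1.86 (i.e. 1:0.54)."""
    return p / (1 - p)

p_skeptic, p_believer = 0.02, 0.65

print(1 / odds_for(p_skeptic))   # about 49   -> 2% is odds of 1:49, as stated
print(1 / odds_for(p_believer))  # about 0.54 -> 65% is roughly 1:0.53, as stated

# Loss frequencies mentioned in the reply: betting on FOOM at 65%, you expect to
# lose about once every 3 bets; betting against FOOM at 2%, about once in 50.
print(1 / (1 - p_believer))  # about 2.9
print(1 / p_skeptic)         # 50.0

# One conventional compromise (not necessarily what was meant here) is the
# geometric mean of the two implied odds, which comes out near 0.19:1 in favor
# of FOOM, i.e. stakes of roughly 5:1 rather than 2:1.
fair_odds = (odds_for(p_skeptic) * odds_for(p_believer)) ** 0.5
print(fair_odds, 1 / fair_odds)  # about 0.19 and 5.1
```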
At the moment I unfortunately don’t have enough cash to invest in betting projects.
Additionally, I don’t know Eliezer personally, while there are people on LessWrong who do and who might have access to nonpublic information. As a result it’s not a good topic for betting money.
Fortunately, that’s why we have PredictionBook! Looking through my compilation of predictions (http://www.gwern.net/hpmor-predictions), I see we already have two relevant predictions:
Harry will create a superintelligent AI using magic or magical objects
and it won’t be Friendly, killing many/all
(I’ve added a new, more general one as well.)
I added my prediction to that.