In other words, one of us did not specify the prediction correctly.
I don’t think it’s me. I deliberately didn’t say it’d destroy the world. Would it be correct to modify yours to say “…and not making the world a worse place”?
No. If you look at the original bet with Eliezer, he was betting that on those conditions, the AI would literally destroy the world. In other words, if both of us are still around, and I’m capable of claiming the money, I win the bet, even if the world is worse off.
Yup. If he lives to collect, he collects.
Assuming that there is, in fact, a correct way to specify the predictions. It’s possible that you weren’t actually disagreeing and that you both assign substantial probability to (world is made worse off but not destroyed | non-FAI is created) while still having a low probability for (non-FAI is created in the next decade).