Give me five dollars, or I will kill as many puppies as it takes to make you pay. And they’ll go to hell. And there in that hell will be fire, brimstone, and rap with Engrish lyrics.
I think the problem is not Solomonoff induction or Kolmogorov complexity or Bayesian rationality, whatever the differences between them are, but you. You don’t want an AI to think like this because you don’t want it to kill you. Meanwhile, to a true altruist, it would make perfect sense.
I’m not really confident. It’s obvious that no society of selfish beings whose members think like this could function. But, absurdly, they’d still be happier on average.
Well, in that case, one possible response is for me to kill YOU (or report you to the police, who will arrest you for threatening mass animal cruelty). But if you’re really a super-intelligent being from beyond the simulation, then trying to kill you will inevitably fail and will probably cause those 3^^^^3 people to suffer as a result.
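For a sense of scale, here is a minimal Python sketch (my own illustration, not part of the thread) of Knuth's up-arrow notation, which is what a number like 3^^^^3 is written in:

```python
# A minimal sketch (illustration only) of Knuth's up-arrow notation,
# just to indicate what a number like 3^^^^3 means.
def hyper(a, n, b):
    """Compute a with n up-arrows applied to b; n == 1 is plain exponentiation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # Recursive definition: a ^^..^ b  =  a ^..^ (a ^^..^ (b - 1))
    return hyper(a, n - 1, hyper(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 = 7,625,597,484,987 -- already huge but computable.
print(hyper(3, 2, 3))

# 3^^^3 = 3^^(3^^3) and 3^^^^3 = 3^^^(3^^^3) are utterly beyond computation;
# do not actually call hyper(3, 4, 3).
```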
(The most plausible scenario in which a Pascal’s Mugging actually occurs? Our simulation is being tested for the coherence of its expected-utility calculations. Fail the test and the simulation will be terminated.)
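To make the expected-utility point concrete, here is a toy Python comparison with assumed numbers (a tiny credence in the threat and a finite stand-in for 3^^^^3, since the real number can't be represented); it only illustrates why a naive maximizer hands over the five dollars:

```python
# A toy expected-utility comparison (assumed numbers, not from the thread):
# any fixed nonzero credence in the mugger's threat is swamped by a large
# enough stake.
p_threat_real = 1e-20        # assumed credence that the mugger can deliver
lives_at_stake = 10 ** 100   # stand-in for 3^^^^3 (vastly too small, but finite)
cost_of_paying = 5.0         # disutility of losing five dollars, in the same units as a life

eu_pay = -cost_of_paying                       # pay: lose $5, nobody suffers
eu_refuse = -p_threat_real * lives_at_stake    # refuse: tiny chance of a vast loss

print(f"EU(pay)    = {eu_pay}")
print(f"EU(refuse) = {eu_refuse:.3e}")
print("Naive maximizer pays the mugger" if eu_pay > eu_refuse else "Refuses")
# 1e-20 * 10^100 = 10^80 expected lives lost, so the calculation says pay --
# which is exactly the incoherence such a "test" would be probing for.
```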