Follow your priors. The problem here is that the prior for Hell has been constructed "artificially" to have an unnaturally high probability.
I think my claim was that your example was somewhat weak, since it's not obvious the AI is doing anything wrong. On reflection, though, that doesn't really matter, since I can easily generate a better example.