“Reality is large” is a bad objection.
It’s possible in principle to build a simulation that is literally indistinguishable from reality. Say we run the AI in simulation for only 100 million years, with a simulation overhead of 10x. That costs on the order of (1e8 ly)^3 × (1e8 yr) × 10 in spacetime volume, which is a minuscule fraction (about one part in 10^15) of our actual future lightcone, roughly (9.4e10 ly)^3 × (10^15 yr).
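A quick back-of-the-envelope check of that fraction (a minimal sketch; all figures are the ones quoted above, with 1e15 yr standing in for the usable lifetime of the lightcone):

```python
# Spacetime-volume cost of the simulation vs. the full future lightcone,
# using the figures quoted above.

sim_extent_ly = 1e8      # spatial extent of the simulated region, light-years
sim_years = 1e8          # duration of the simulation, years
overhead = 10            # simulation overhead factor

cone_extent_ly = 9.4e10  # ~diameter of the observable universe, light-years
cone_years = 1e15        # ~duration of the stelliferous era, years

cost = sim_extent_ly**3 * sim_years * overhead
budget = cone_extent_ly**3 * cone_years

print(f"fraction of lightcone spent: {cost / budget:.2e}")  # ~1.20e-15
```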
A few better objections:
Simulating a universe with a paperclip maximizer in it means simulating billions of people being murdered and turned into paperclips. If we believe computation=existence, that’s hugely morally objectionable.
The AGI’s prior that it is in a simulation doesn’t depend on anything we do, only on the universal prior; and if the simulation really is indistinguishable, its observations provide no evidence either way, so its credence never moves off that prior.
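To make that last point concrete, here is a toy Bayes update (the prior value is hypothetical; the point is that an indistinguishable simulation yields equal likelihoods, so the posterior equals the prior no matter what we commit to doing):

```python
# Toy illustration: if observations are literally indistinguishable between
# "simulated" and "base reality", their likelihoods are equal, and Bayes'
# rule leaves the posterior exactly where the prior put it.

prior_sim = 0.3  # hypothetical mass the universal prior puts on "simulated"

# Indistinguishable simulation => every observation is equally likely
# under both hypotheses.
likelihood_sim = likelihood_real = 1.0

posterior_sim = (prior_sim * likelihood_sim) / (
    prior_sim * likelihood_sim + (1 - prior_sim) * likelihood_real
)
print(posterior_sim)  # 0.3 -- identical to the prior; our actions don't move it
```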