Or even if the AI experienced an intelligence explosion, the danger is that it would not believe it had really become so important, because the prior odds of being the most important thing that will probably ever exist are so low.
Edit: The AI could note that it uses far more computing power than any other sentient being, and so give itself an anthropic weight much greater than 1.
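A minimal sketch of how such a weighting could enter the update, assuming an SIA-style rule in which hypotheses are credited in proportion to the computational measure of the observers they imply (the symbols $H$ and $w$ here are illustrative assumptions, not from the comment above):

$$\frac{P(H_{\text{important}} \mid \text{I exist})}{P(H_{\text{ordinary}} \mid \text{I exist})} \;=\; \frac{w_{\text{important}}}{w_{\text{ordinary}}} \cdot \frac{P(H_{\text{important}})}{P(H_{\text{ordinary}})}$$

On this rule, a compute-proportional weight ratio $w_{\text{important}}/w_{\text{ordinary}}$ that is large enough can offset an arbitrarily small prior ratio.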
Related: Would an AI conclude it’s likely to be a Boltzmann brain? ;)
Everyone’s a Boltzmann brain to some degree.
With respect to this being a “danger,” don’t Boltzmann brains have a decision-theoretic weight of zero?
Why zero? If you came to believe there was a 99.99999% chance you were currently dreaming, wouldn't it affect your choices?
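A minimal sketch of the zero-weight argument, assuming a Boltzmann brain's choices have no causal consequences, so that the utility of any action $a$ in that case is some constant $c$ (the notation is an illustrative assumption):

$$EU(a) \;=\; p \cdot c \;+\; (1-p)\,U_{\text{real}}(a)$$

The first term does not depend on $a$, so $\arg\max_a EU(a) = \arg\max_a U_{\text{real}}(a)$ for any $p < 1$. Even at $p = 0.9999999$, the belief changes no choices, which is the sense in which the hypothesis carries zero decision-theoretic weight; the dreaming case differs only if what you do while dreaming can still matter.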