The problem is that in this specific case, “the world state you want” is more or less defined as something that is easy to model (because you are rewarded when your model accurately predicts the world), which may give you incentives to destroy exceptionally complicated things… such as life.