Especially in the age of AGI, leaders may no longer need to respect the values of most people, because those people are no longer economically relevant.
Or militarily relevant. Traditionally, if you were a ruler, you had to at least keep your army happy. But if you command a fully automated army with no actual people in it, there's no risk of the army turning against you. You have the robot weapons and nobody else does, so you can do whatever the hell you want to people without having to care what anyone else thinks.
Does any process by which they ended up the way they did without considering your decision procedure count as #2? Like, suppose the extortionist expects almost all the agents it encounters to be CDT agents that do give in to extortion, and it judges the occasional nuclear war with a rock or UDT agent an acceptable cost.
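The "worth it" judgment here is just an expected-value comparison over the population of agents the extortionist expects to meet. Here's a minimal sketch of that calculation in Python; all the payoffs and the population mix are made-up illustrative numbers, not anything from the original discussion.

```python
# Hypothetical EV check for the extortionist described above. The payoffs
# (gain_if_capitulate, loss_if_war) and the capitulation rate are
# illustrative assumptions only.

def extortion_ev(p_capitulate: float,
                 gain_if_capitulate: float = 10.0,
                 loss_if_war: float = -100.0) -> float:
    """Expected value of a policy of always extorting, when a fraction
    p_capitulate of encountered agents are CDT-style capitulators and the
    rest (rocks, UDT agents) never give in, triggering war."""
    p_war = 1.0 - p_capitulate
    return p_capitulate * gain_if_capitulate + p_war * loss_if_war

# With these made-up payoffs, always-extort beats never-extort (EV 0)
# whenever more than 10/11 (~90.9%) of encountered agents capitulate:
for p in (0.85, 0.95):
    print(f"p_capitulate={p}: EV={extortion_ev(p):.1f}")
```

The point of the sketch is just that a sufficiently CDT-heavy environment can make extortion positive-EV even though the extortionist never modeled your decision procedure at all, which is what makes the "#2" classification question bite.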