Lots of incorrect answers in other replies to this one. The real answer is that, from Luke’s perspective, creating Luke-friendly AI and becoming king of the universe isn’t much better than creating regular friendly AI and getting the same share of the universe as any other human, because it turns out that after the first thousand galaxies’ worth of resources and a trillion trillion millennia of lifespan, you hit such diminishing returns that another seven-billion-fold increase in resources isn’t a big deal.
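A minimal sketch of the diminishing-returns point, assuming (purely for illustration) a logarithmic utility of resources and rough order-of-magnitude numbers; the original comment doesn't commit to any specific utility function:

```python
import math

# Toy illustration with hypothetical logarithmic utility of resources.
# Log is just the classic example of diminishing returns; the numbers
# below are rough order-of-magnitude assumptions, not claims.

TOTAL_RESOURCES = 1e80   # roughly the number of atoms in the observable universe
HUMANS = 7e9             # human population at the time of writing

equal_share = TOTAL_RESOURCES / HUMANS   # Luke's share under regular friendly AI
kings_share = TOTAL_RESOURCES            # Luke's share as king of the universe

u_equal = math.log(equal_share)
u_king = math.log(kings_share)

print(f"utility of an equal share:      {u_equal:.1f}")
print(f"utility of the whole universe:  {u_king:.1f}")
print(f"relative gain from being king:  {u_king / u_equal - 1:.1%}")

# Under log utility, multiplying resources by seven billion adds only
# log(7e9) ~= 22.7 to a total that is already ~161, about a 14% bump --
# nothing remotely like a seven-billion-fold improvement.
```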
This isn’t true for every value: he might assign value to certain things not existing, like powerful people besides him, whom other people do want to exist. And that last factor of seven billion is worth something. But these are tiny differences in value, utterly dwarfed by the reduced AI-creation success rate that would result if the programmers got into a flamewar over who should be king.