I gather a group of people, brainwash them to have some arbitrary collection of values, then somehow turn them into utility monsters?
I was thinking, ‘Find the person with the desires you can most easily satisfy—or whose desires allow the construction of the most easily satisfied successor to the throne—and declare that person the utility monster by a trick of measurement.’ But I don’t think I quite understand the problem at hand. Eliezer sounds like he has in mind measurement-scheme criteria that would rule this out (just as extrapolating the people who exist before the AI hopefully rules out brainwashing).
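To make the ‘trick of measurement’ concrete, here is a minimal sketch under an assumption of my own: an aggregator that naively sums whatever raw utility numbers each agent’s function reports (nothing here is taken from Eliezer’s proposal; all names are illustrative). Rescaling one agent’s utility function changes nothing about what that agent actually prefers, yet it is enough to make the aggregate cater to them exclusively.

```python
# Illustrative sketch only: a naive sum-of-raw-utilities aggregator,
# which is exactly the kind of scheme a measurement trick can game.

def aggregate(option, utility_fns):
    """Sum each agent's reported utility for the given option."""
    return sum(u(option) for u in utility_fns)

# Two agents with opposite preferences over options "a" and "b".
alice = lambda o: 1.0 if o == "a" else 0.0
bob = lambda o: 1.0 if o == "b" else 0.0

# "Monster" is just Bob with his utilities multiplied by 1000 --
# a pure change of measurement scale, not a change in what he wants.
monster = lambda o: 1000.0 * bob(o)

options = ["a", "b"]
print(max(options, key=lambda o: aggregate(o, [alice, bob])))      # "a" (the two tie; max keeps the first)
print(max(options, key=lambda o: aggregate(o, [alice, monster])))  # "b" (the rescaled agent dominates)
```

The point of the sketch is only that the measurement convention, not anyone’s actual desires, decides who dominates, which is why criteria built into the measurement scheme would be needed to rule this out.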