Eliezer, this post seems to me to reinforce, not weaken, a “God to rule us all” image. Oh, and among the various clues that might indicate to me that someone would make a good choice with power, the ability to recreate that power from scratch does not seem a particularly strong clue.
That was my first reaction as well, but Eliezer must have intentionally chosen a “clue” that is not too strong. After all, an FAI doesn’t really need to use any clues—it can just disallow any choice that is not actually good (except that would destroy the feeling of free will). So I think “let someone make a choice with a power if they can recreate that power from scratch” is meant to be an example of the kind of tradeoff an FAI might make between danger and freedom.
What I don’t understand is, since this is talking about people born after the Singularity, why do parents continue to create children who are so prone to making bad choices? I can understand not wanting to take an existing person and forcibly “fix” them, but is there supposed to be something in our CEV that says even new beings created from scratch must have a tendency to make wrong choices in order to be maximally valuable?