Please let me interrupt this discussion on utilitarianism/humanism with an alternative perspective.
I do not claim to know what the meaning of life is, but I can rule certain answers out. For example, I am highly certain that it is not to maximize the number of paperclips in my vicinity.
I also believe it has nothing to do with how much pain or pleasure the humans experience—or in fact anything to do with the humans.
More broadly, I believe that although intelligent or ethical agents may well be integral to the meaning of life, they are integral for what they do, not because the success or failure of the universe somehow hinges on what the sentients experience or on whether their preferences or desires are realized.
Humans, or rather human civilization, which is an amalgam of humans, knowledge and machines, are of course the most potent means anyone knows about for executing plans and achieving goals. Hobble the humans and you probably hobble whatever it is that really is the meaning of life.
But I firmly reject humanity as the repository of ultimate moral value.
It looks to me like Eliezer plans to put humanism at the center of the intelligence explosion. I think that is a bad idea. I am horrified. I am appalled.