That’s one element in what started my line of thought… I was imagining situations where I would consider the exchange of human lives for non-human objects. How many people’s lives would be a fair exchange for a pod of bottlenose dolphins? A West Virginia mountaintop? An entire species of snail?
I think what I’m getting towards is there’s a difference between human preferences and human preference for other humans. And by human preferences, I mean my own.
That is one objection to Coherent Extrapolated Volition (CEV): that human values are too diverse. Though the space of possible futures an AGI could spit out is VERY large compared to the space of futures people would want, even accounting for the diversity of human values.