Phil, very well articulated and interesting stuff. Have you seen Wall-E? It’s the scenario your post warns against, but with physical instead of evolutionary fitness.
I agree that Eliezer seems to have brushed aside your viewpoint without giving it due deliberation, when the topic of the ethics of transcending evolution seems right up his street for blogging on.
However: it considers the “preferences” (which I, being a materialist, interpret as “statistical tendencies”) of organisms, or of populations, but not of the dynamic system. Why do you discriminate against the larger system?
Because he can. You’re straying close to the naturalistic fallacy here. Just as soon as natural selection gets around to building a Bayesian superintelligence, it can specify whatever function it wants to. We build the AI, we get to give it our preferences. What’s unfair about that?
Besides, we departed from selection’s straight-and-narrow when we made chocolate, condoms, penicillin and spacecraft. We are what selection made us, with our thousand shards of desire, but I see no reason why we should be constrained by that. Our ethics are long since divorced from their evolutionary origins. It’s understandable to worry that this makes them vulnerable—I think we all do. It won’t be easy bringing them with us into the future, but that’s why we’re working hard at it.
@Lara: what ‘humans’ want as a compiled whole is not what we’ll want as individuals
Great description of why people in democracies bitch constantly but never rise up. The collective gets what it wants, but the individuals are never happy. If I were a superintelligence, I’d just paperclip us all and be done with it.