“This talk about “‘right’ means right” still makes me damn uneasy. I don’t have more to show for it than “still feels a little forced”—when I visualize a humane mind (say, a human) and a paperclipper (a sentient, moral one) looking at each other in horror, knowing there is no way they could agree about whether to use atoms to feed babies or to make paperclips, I feel wrong. I think about the paperclipper in exactly the same way it thinks about me! Sure, that’s also what happens when I talk to a creationist, but there we’re both trying to approximate an external truth; and if our priors were too stupid, our genetic line would be extinct (or at least that’s what I think)—but morality doesn’t work like probability; it’s not trying to approximate anything external. So I don’t feel any happier about the moral miracle that made us than about the one that made the paperclipper.”
Oh my, this is so wrong. Are you postulating that the paperclipper would have gone extinct too, due to natural selection? Somehow I don’t see the mechanisms of natural selection applying to it, given that it was created once by humans and then exploded, and all that.
If 25% of its “moral drive” is the result of a programming error, is it still “understandable and as much of a worthy creature/shaper of the Universe” as us? That’s the cosmopolitan view Eliezer describes, and I don’t see why you’re convinced that admiring static is just as good as admiring evolved structure. It might just be bias, but the latter seems much better. Order > chaos, no?