Seems to me that if you weren’t human, you wouldn’t care about morality (and instead care about paperclips or whatever).
You could take “morality” to be “my peculiar preference for the letter v,” but it seems to me that a more natural meaning of “morality” is “things other people should do.” Any agent that interacts with other agents has a stake both in how windfalls are distributed and in the process used to distribute them, and so I’d like to talk about “fair” in a way that paperclippers, pebblesorters, and humans all find interesting.
That is, why is it difficult to treat “my particular value system,” “value systems in general,” “my particular protocols for interaction,” and “protocols for interaction in general” as different things? Why, when Eliezer is so quick to taboo words and get to the heart of things in other areas, does he not do so here?
So even if you try to imagine yourself as some kind of neutral disembodied mind, the fact that this mind is interested in morality (instead of paperclips) shows that it’s a human in disguise.
But when modelling a paperclipper, the neutral disembodied mind isn’t interested in human morality; it’s interested in paperclips, and it thinks of the desire for paperclips as the universal impulse. That is to say, I think I have more control over my interests than this thought experiment presumes.
You’ve passed the recursive buck here.
Sort of? I’m not trying to explain morality, but to label it, and I think the word “should” makes a decent label for the cluster of things that make up the “morality” I was trying to point to. The other version I came up with was about thirty words long, and I figured “should” was a better choice than that.