I’d welcome more quality discussion of philosophical topics such as morality here. You occasionally see people pop up and say confused things about morality, like
It has been suggested that animals have less subjective experience than people. For example, it would be possible to have an animal that counts as half a human for the purposes of morality.
… that got downvoted, but I still get the impression that confused thinking like that pops up more often on the topic of morality than on others (except Friendly AI), and that Eliezer didn’t do a good enough job teaching sane and clear thinking about morality to his readers—including myself.
And morality is a topic that’s smack in the middle of philosophy, and AI and statistics don’t teach us much about it (though cognitive science and experimental philosophy do). So I hope that more input from academic philosophy might raise the quality of thinking here about morality.
Why is the quoted example confused? It seems to me that subjective experience has something to do with morality, and in such a way that having less of it makes you less morally significant.
Possibly “something to do with morality”, yes, but moral worth isn’t so straightforwardly equal to subjective experience that you can use it to calculate the ratio between “how much some animal is worth” and “how much a human is worth”. Or maybe it is, but then we’d need an actual argument for that, not just the assumption that it’s so.
Metaethics is my specialty, so I’ve got some ‘dissolving moral problems’ posts coming up, but I need to write some dependencies first.