Often, when I stop to think about a decision, I find that my desire changes upon reflection. The post-reflection desire generally seems more intellectually coherent(*), and across multiple occasions, my initial desires are generally more inconsistent with one another, while my after-reflection desires are more consistent with one another. From this I infer the existence of a (possibly only vague, partially specified, or partially consistent) common cause behind the various after-reflection desires. This common cause appears to roughly resemble a bundle of heuristics that collectively approximate some sort of optimization criteria. I call the bundle of heuristics my “moral intuition” and the criteria they approximate my “morality”.
I suspect that other humans’ minds are broadly similar to mine in this respect, and that their moral intuitions are broadly similar to mine. To the extent they correlate, we might call the set of common trends “human morality” or “humaneness”.
(*) An example of intellectual coherence vs. incoherence: Right now, I’d like to go get some ice cream from the freezer. However, on reflection, I remember that there isn’t any ice cream in the freezer at the moment, so walking over to the freezer would not satisfy the impulse that motivated the action.
What about the Baby-Eaters and the Super Happy People in the story Three Worlds Collide? Do they have anything you would call “humaneness”?
No.
Edit: Well, sort of. Some of their values partially coincide with ours. But one of the major themes of the story is that we should expect aliens to have inhumane value systems.