The next LW Brussels meetup will be about morality, and I want to have a bunch of moral dilemmas prepared as conversation-starters. And I mean moral dilemmas that you can’t solve with one easy utilitarian calculation. Some in the local community have had little exposure to LW articles, so I’ll definitely mention standard trolley problems and “torture vs dust specks”, but I’m curious if you have more original ones.
It’s fine if some of them use words that should really be tabooed. The discussion will double as a taboo exercise.
A lot of what I came up with revolves around the boundaries of sentience. E.g. on a scale that goes from self-replicating amino acid to transhumans (and includes animals, babies, the severely mentally handicapped...), where do you place things like “I have a moral responsibility to uplift those to normal human intelligence once the technology is available” or “it’s fine if I kill/eat/torture those”, and how much of one kind of life would you be willing to trade off for a superior kind? Do I have a moral responsibility to uplift babies? Uh-
Trading off lives for things whose value is harder to put on the same scale is also interesting. E.g. “will you save this person, or this priceless cultural artifact, or this species near extinction?” (Yes, I’ve seen the SMBC.)
(Reposted from the LW facebook group)
I thought of this moral dilemma:
There are two options.
1. You experience a significant amount of pain; 5 minutes later you completely forget the experience, as if you were never in pain at all.
2. You experience slightly less pain than in option 1, but you don’t forget it.
Which one would you choose?