Anon prime: dollars are not utility. Economic egalitarianism is instrumentally desirable. We don’t normally favor all types of equality, as Robin frequently points out.
Kyle: cute
Eliezer: My impulse is to choose the torture, even when I imagine very bad kinds of torture and very small annoyances (I think one can go smaller than a dust mote, possibly something like a letter on the spine of a book that your eye sweeps over being set in a slightly less well-chosen font). Then, however, I think of how much longer the torture could last and still not outweigh the trivial annoyances if I take the utilitarian perspective, and my mind breaks. Condoning 50 years of torture, or even a day's worth, is pretty much the same as condoning universes of agonium lasting for eons in the face of numbers like these, and I don't think I can condone that for any amount of a trivial benefit.
(This was my favorite reply, BTW.)
I admire the restraint involved in waiting nearly five years before selecting a favorite.
Well, too bad he didn't wait a year longer, then ;). I think preferring torture is the wrong answer for the same reason that I think universal health care is a good idea. The financial cost of serious illness and injury is distributed over the taxpaying population, so no single individual has to deal with a spike in medical costs ruining their life. And I think that's still the correct moral choice regardless of whether universal health care happens to be more expensive overall or not.
Analogously, I think the exact same logic applies to dust vs. torture. I don't think the correct moral choice is about minimizing the total area under the pain-curve at all; it's about avoiding severe pain-spikes for any given individual, even at the cost of a larger area under the curve. I don't think "shut up and multiply", in its simplistic conception, applies here the way it might in the scenario where you have to choose whether 400 people live for sure or 500 people live with 0.9 probability (and die with 0.1 probability).
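For concreteness, here is the arithmetic in that second scenario, where multiplying really does seem to settle the question (a minimal sketch; the numbers are just the ones from the example):

```python
# Expected survivors under each option (numbers straight from the example above).
certain = 400 * 1.0   # 400 people live for sure
gamble  = 500 * 0.9   # 500 people live with probability 0.9, die with 0.1

print(certain)  # 400.0
print(gamble)   # 450.0 -- naive expected-value multiplication favors the gamble
```

That kind of multiplication is uncontroversial when the unit (lives) is the same on both sides; my claim is that it doesn't carry over to aggregating many tiny pains against one enormous one.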
Irrespective of the above, however, the thought experiment is a bit problematic, because it's more complex than it first appears if we really take it seriously. Eliezer said the dust specks are "barely noticed", but being conscious or aware of something isn't an either-or affair: awareness falls on a continuum, so whatever "pain" a dust speck causes has to be multiplied by how aware of it the person really is. If someone is tortured, that person is presumably very aware of the physical and emotional pain.
Leaving aside other possible consequences like lasting damage or social repercussions, I don't really care all that much about any pain that happens to me while I'm not aware of it. I could probably find out whether or not pain is actually registered in my brain during my upcoming operation under anesthesia, but the fact that I won't bother tells me very clearly that awareness of pain is an important weight we have to multiply, in some fashion, with the actual pain-registration in the brain.
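In other words, the weighting I have in mind looks something like this (a toy formula with made-up magnitudes, only to show where the awareness factor enters):

```python
def effective_disutility(raw_pain, awareness):
    """Toy model: felt disutility = raw pain signal times degree of awareness (0..1)."""
    return raw_pain * awareness

# Purely illustrative magnitudes -- none of these numbers come from anywhere:
speck   = effective_disutility(raw_pain=0.001, awareness=0.05)  # barely noticed
torture = effective_disutility(raw_pain=1000.0, awareness=1.0)  # fully attended to
print(speck, torture)  # 5e-05 1000.0
```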
That's just an additional consideration, though. Even if we simplify matters and imagine the two pains are directly comparable, with no difference in quality at all, and the total quantity of pain is vastly higher in the dust scenario than in the torture scenario, it changes nothing about my choice.
So what does that tell me about the relationship between utility and morality? I don't accept that morality is just about the total lump sums of utility and disutility; I think we also have to consider their distribution across the population in question. Why is that? I ask myself, and my brain offers the following answer:
If I were the only agent in the entire universe and had to pick torture vs. dust for myself (and, obviously, if I were immortal or had a long enough life to experience all those dust specks), I would still prefer the larger area under the curve over the pain-spike, even assuming direct comparability of the two kinds of pain. I suspect the reason is a kind of time-discounting my brain does: I'd rather suffer a little pain every day for a trillion years than a big spike for 50 years. Considering that utility is (or at least, I think, should be defined as) something that only results from the interaction of minds and environments, my mind and its workings are definitely part of the equation that determines what has utility and what doesn't. And my mind wants to suffer low disutility evenly distributed over a long time-period rather than great disutility in a 50-year spike (assuming a trillion-year lifetime).
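That time-discounting intuition can be made concrete. Here's a sketch under assumed parameters (an exponential discount rate and made-up pain magnitudes; nothing here is meant as the "right" discount function):

```python
import math

# Toy model of the time-discounting I suspect my brain does. All numbers are
# assumptions: exponential discounting at rate r per year, annual time steps.
def discounted_disutility(pain_per_year, years, r=0.03):
    return sum(pain_per_year * math.exp(-r * t) for t in range(years))

spike  = discounted_disutility(pain_per_year=100.0, years=50)  # big spike, starting now
# The trillion-year stream converges under discounting, so a long finite
# horizon stands in for "forever":
stream = discounted_disutility(pain_per_year=1.0, years=10_000)

print(round(spike))   # ~2629
print(round(stream))  # ~34 -- the far larger undiscounted area comes out much smaller
```

The undiscounted area under the curve is enormously larger for the stream, yet any discounting of this shape ranks it as the lesser evil, which matches the preference I actually have.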
"I don't think the correct moral choice is about minimizing the total area under the pain-curve at all; it's about avoiding severe pain-spikes for any given individual, even at the cost of a larger area under the curve."
If you're going to say that, you'll need some threshold, such that pain over the threshold makes the whole society count as worse than any amount of pain under the threshold. This will mean that any number of people with pain X is better than one person with pain X + epsilon, where epsilon is very small but happens to push the pain over the threshold.
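To spell out why that's awkward, here's a toy encoding of such a threshold rule (the cutoff and all numbers are arbitrary):

```python
# Toy version of the threshold view, with an arbitrary cutoff.
THRESHOLD = 10.0

def society_badness(pain, count):
    """Badness of `count` people all suffering `pain`, ranked lexically:
    over-threshold pain dominates any amount of under-threshold pain."""
    over  = pain * count if pain > THRESHOLD else 0.0
    under = pain * count if pain <= THRESHOLD else 0.0
    return (over, under)  # Python tuples compare lexicographically

x, eps = 10.0, 0.001
# A trillion people at pain X count as better (less bad) than one person at X + epsilon:
print(society_badness(x, count=10**12) < society_badness(x + eps, count=1))  # True
```

The discontinuity at the cutoff is doing all the work, and an imperceptible epsilon flips the verdict, which is exactly the problem.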
Alternatively, you could say that the disutility of pain changes gradually, but that has other problems. I suggest you read up on the repugnant conclusion (http://plato.stanford.edu/entries/repugnant-conclusion/); depending on exactly what you mean, what you suggest is similar to the proposed solutions, which don't really work.