Eliezer, to be clear, do you still think that sparing 3^^^3 people a momentary eye irritation—a dust speck each—is worth torturing a single person for 50 years, or is there a possibility that you did the math incorrectly for that example?
No. I used a number large enough to make math unnecessary.
I specified the dust specks had no distant consequences (no car crashes etc.) in the original puzzle.
Unless the torture somehow causes Vast consequences larger than the observable universe, or the suicide of someone who otherwise would have been literally immortal, it doesn’t matter whether the torture has distant consequences or not.
I confess I didn’t think of the suicide one, but I was very careful to choose an example that didn’t involve actually killing anyone, because then someone was bound to point out that there was a greater-than-tiny probability that literal immortality is possible and would otherwise be available to that person.
So I will specify only that the torture does not have any lasting consequences larger than a moderately sized galaxy, and then I’m done. Nothing bound by lightspeed limits in our material universe can morally outweigh 3^^^3 of anything noticeable. You’d have to leave our physics to do it.
You know how some people’s brains toss out the numbers? Well, when you’re dealing with a number like 3^^^3 in a thought experiment, you can toss out the event descriptions. If the thing being multiplied by 3^^^3 is good, it wins. If the thing being multiplied by 3^^^3 is bad, it loses. Period. End of discussion. There are no natural utility differences that large.
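For a rough sense of the scale being invoked, here is a short sketch of how fast the up-arrow tower grows (the comparison to ~10^80 atoms is only a familiar yardstick, not anything from the original exchange):

```python
import math

# Knuth up-arrow notation: 3^^n is a tower of n threes, and 3^^^3 = 3^^(3^^3).
tower_2 = 3 ** 3            # 3^^2 = 3^3 = 27
tower_3 = 3 ** tower_2      # 3^^3 = 3^27 = 7,625,597,484,987

# 3^^4 = 3^(3^^3) is far too big to hold in memory, but we can count its digits:
digits_in_tower_4 = int(tower_3 * math.log10(3)) + 1

print(tower_3)              # 7625597484987
print(digits_in_tower_4)    # ~3.6 trillion digits

# 3^^^3 is a tower of 7,625,597,484,987 threes. Already the fourth storey of that
# tower has ~3.6e12 digits, dwarfing the ~10^80 atoms in the observable universe;
# this is why any nonzero per-speck disutility, multiplied out, swamps the torture term.
```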
What about the consequences of the precedent set by the person making the decision that it is ok to torture an innocent person, in such circumstances? If such actions get officially endorsed as being moral, isn’t that going to have consequences which mean the torture won’t be a one-off event?
There’s a rather good short story about this, by Ursula K LeGuin:
The Ones Who Walk Away From Omelas
If such actions get officially endorsed as being moral, isn’t that going to have consequences which mean the torture won’t be a one-off event?

Why would it?
And I don’t think LeGuin’s story is good—it’s classic LeGuin, by which I mean enthymematic, question-begging, emotive substitution for thought, which annoyed me so much that I wrote my own reply.
I’ve read your story three times now and still don’t know what’s going on in it. Can I have it in the form of an explanation instead of a story?
Sure, but you’ll first have to provide an explanation of LeGuin’s.
There is this habitation called Omelas in which things are pretty swell for everybody except one kid who is kept in lousy conditions; by unspecified mechanism this is necessary for things to be pretty swell for everybody else in Omelas. Residents are told about the kid when they are old enough. Some of them do not approve of the arrangement and emigrate.
Something of this form about your story will do.
There is this city called Acre where things are pretty swell except for this one guy who has a lousy job; by a well-specified mechanism, his job makes him an accessory to murders which preserve the swell conditions. He understands all this and accepts the overwhelmingly valid moral considerations, but still feels guilty—in any human paradise, there will be a flaw.
“Omelas” contrasts the happiness of the citizens with the misery of the child. I couldn’t tell from your story that the tradesman felt unusually miserable, nor that the other people of his city felt unusually happy. Nor do I know how this affects your reply to LeGuin, since I can’t detect the reply.
Since the mechanism is well-specified, can you specify it?
I thought it was pretty clear in the story. It’s not easy coming up with analogues to crypto, and there’s probably holes in my lock scheme, but good enough for a story.
Please explain it anyway.
(It never goes well for me when I reply to this sort of thing with snark. So I edited away a couple of drafts of snark.)
It’s a prediction market where the predictions (that we care about, anyway) are all of the form “I bet X that Y will die on date Z.”
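For concreteness, here is a minimal sketch of that bet structure and a settlement rule. The names, dates, stakes, and the winner-takes-the-pot-minus-fee rule are illustrative assumptions, not details specified in the story:

```python
from dataclasses import dataclass
from datetime import date
from typing import Dict, List

@dataclass
class Bet:
    """One prediction of the form 'I bet `stake` that `target` will die on `predicted_date`.'"""
    bettor: str          # in the story, this identity stays hidden behind the merchant
    target: str
    predicted_date: date
    stake: float

def settle(bets: List[Bet], target: str, actual_date: date, fee_rate: float = 0.05) -> Dict[str, float]:
    """Whoever named the actual date splits the whole pot for this target, minus the merchant's fee."""
    pool = [b for b in bets if b.target == target]
    pot = sum(b.stake for b in pool)
    winners = [b for b in pool if b.predicted_date == actual_date]
    if not winners:
        return {}  # nobody named the date; the pot just keeps growing
    payout = pot * (1 - fee_rate)
    winning_stake = sum(b.stake for b in winners)
    return {b.bettor: payout * b.stake / winning_stake for b in winners}

# The incentive: holding the correct date is worth nearly the entire pot.
bets = [Bet("A", "Sammael", date(1190, 6, 1), 10.0),
        Bet("B", "Sammael", date(1190, 6, 3), 40.0)]
print(settle(bets, "Sammael", date(1190, 6, 3)))  # {'B': 47.5}
```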
Okay, and I imagine this would incentivize assassins, but how is this helping society be pretty swell for most people, and what is the one guy’s job exactly? (Can you not bet on the deaths of arbitrary people, only people it is bad to have around? Is the one guy supposed to determine who it’s bad to have around or something and only allow bets on those folks? How does he determine that, if so?)
Everything you’d want to know about assassination markets.
but how is this helping society be pretty swell for most people, and what is the one guy’s job exactly?

Incentive to cooperate? A reduction in the necessity of war, which is by nature an inefficient use of resources? From the story:

The wise men of that city had devised the practice when it became apparent to them that the endless clashes of armies on battlefields led to no lasting conclusion, nor did they extirpate the roots of the conflicts. Rather, they merely wasted the blood and treasure of the people. It was clear to them that those rulers led their people into death and iniquity, while remaining untouched themselves, lounging in comfort and luxury amidst the most crushing defeat.

It was better that a few die before their time than the many. It was better that a little wealth go to the evil than much; better that conflicts be ended dishonorably once and for all, than fought honorably time and again; and better that peace be ill-bought than bought honestly at too high a price to be borne. So they thought.
Moving on.
(Can you not bet on the deaths of arbitrary people, only people it is bad to have around?

Nope, “badness” is determined by the market.
Is the one guy supposed to determine who it’s bad to have around or something and only allow bets on those folks? How does he determine that, if so?)

The “merchant of death” diffuses the legal culpability associated with betting on the assassination market. The tension in the narrative comes from him feeling ever so slightly morally culpable for the assassinations, even though he only “causes” them indirectly. Again from the story:

Through judicious use of an intermediary (the merchant of death), the predictor could make his prediction, pay the fee, and collect the reward while remaining unknown to all save one.
I think I get it. I have worldbuilding disagreements with this but am no longer bewildered. Thank you!
So, I have some questions: how could you actually make money from this? It seems like the idea is that people place bets on the date that they’re planning to assassinate the target themselves. So… where’s the rest of the money come from, previous failed attempts? I’m not sure that “A whole bunch of guys tried to assassinate the president and got horribly slaughtered for their trouble. That means killing him’d make me rich! Where’s my knife?” is a realistic train of thought.
The gamblers collect their winnings; the merchant of death charges a fee, presumably to compensate for the hypothetical legal liability and moral hazard. See the last quote from the story in grandparent.
It seems like the idea is that people place bets on the date that they’re planning to assassinate the target themselves. So… where’s the rest of the money come from, previous failed attempts?

Or they want someone else to become more motivated to assassinate the target.
I’m not sure that “A whole bunch of guys tried to assassinate the president and got horribly slaughtered for their trouble. That means killing him’d make me rich! Where’s my knife?” is a realistic train of thought.

It’s not, because that’s not how the information on how much a certain death is worth propagates. The assassination market needs to be at least semi-publicly observable—in the story’s case, the weight of the money in the named cylinder pulls it down, showing how much money is in the cylinder. If someone wanted a high-risk target, they’d have to offer more money to encourage the market to supply the service.
Ahh, that was the bit I missed. Okay, that makes sense now.
Edit: Upon rereading, I think this could perhaps be a bit clearer.
To one side of him were suspended cylinders. And each hung at a different height, held by oiled cords leading away into the depths. And upon each cylinder was inscribed a name. The merchant looked at one marked ‘Sammael’. A man he had never met, and never would.

Cylinders hung suspended, okay. Held by cords leading into the “depths”—what?
Into one of the holes by that particular cylinder, he dropped several heavy gold coins. Some time after their clinkings ceased to echo, the cylinder hoisted ever so slightly. Into the other hole he dropped a pouch containing: a parchment note listing a particular date, a fat coin in fee, and a stout lock.

Holes by that cylinder—presumably in the wall or floor? The money goes into the locked treasure room, not the cylinder. And it causes (somehow) the cylinder to rise, not fall.
The idea is that the room in the dungeons has two compartments, one behind each hole. One compartment holds the locks and predictions, and only the ‘winning’ lock is used when the person is assassinated (my offline analogue to crypto signatures). The other holds the money/rewards, and is actually a big cup or something held up by a cord which goes up to the ceiling, around a pulley, and then back down to the cylinder. Hence, the more weight (money) inside the cup, the higher the cylinder is hoisted.
I guess ropes and pulleys are no longer common enough these days to make the setup clear without further explanation?
(This is one of the vulnerabilities as described—what’s to stop someone from dumping in some lead? As I said, real-world equivalents to crypto are hard. Probably this could be solved by bringing in another human weak point—e.g., specifying that only the merchants are allowed to put money in.)
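For comparison, the digital mechanism the locks are standing in for can be sketched as a simple hash commitment; the scheme and the sample prediction string below are illustrative, not something specified in the story:

```python
import hashlib
import secrets

# Digital analogue of the story's lock-and-key: commit to a prediction without
# revealing it, then later prove the sealed prediction was yours.

def commit(prediction: str) -> tuple[str, bytes]:
    nonce = secrets.token_bytes(16)  # the "key" the predictor keeps
    lock = hashlib.sha256(nonce + prediction.encode()).hexdigest()  # the "lock" left with the merchant
    return lock, nonce

def opens(lock: str, nonce: bytes, prediction: str) -> bool:
    return hashlib.sha256(nonce + prediction.encode()).hexdigest() == lock

lock, key = commit("Sammael dies on the 3rd day of the 6th month")
print(opens(lock, key, "Sammael dies on the 3rd day of the 6th month"))   # True
print(opens(lock, b"forged key", "Sammael dies on the 3rd day of the 6th month"))  # False
```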
The described pulley setup will simply accelerate until it reaches one limit or the other, depending on the balance of weights. In order to have the position vary with the load, you need a position-varying force (a rough worked example follows the list), such as:
A spring.
A rotating off-center mass, as in a balance scale. (This is nonlinear for large angles.)
An asymmetric pulley, i.e. a cam (in the shape of an Archimedean spiral).
A tall object (of constant cross-section) entering a pool of water.
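For instance, with the first option—a spring—the resting position becomes proportional to the load. A quick sketch with made-up constants (the spring stiffness and coin mass are assumptions for illustration):

```python
G = 9.81          # m/s^2, gravitational acceleration
K = 200.0         # N/m, spring stiffness (assumed)
COIN_MASS = 0.01  # kg per gold coin (assumed)

def equilibrium_displacement(coins: int) -> float:
    """At rest the spring force balances the added weight: K*x = m*g, so x = m*g/K."""
    return coins * COIN_MASS * G / K

for n in (10, 100, 1000):
    print(n, "coins ->", round(equilibrium_displacement(n), 4), "m")
# roughly 0.005 m, 0.05 m, and 0.5 m: the cylinder's position now tracks the pot,
# instead of running away to one end as in the plain pulley-and-counterweight setup.
```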
For what it’s worth, some people read “Omelas” as being about a superstition that torturing a child is necessary (see the bit about good weather) rather than a situation where torturing a child is actually contributing to public welfare.
And the ‘wisdom of their scholars’ depends on the torture as well? ‘terms’ implies this is a magical contract of some sort. No mechanism, of course, like most magic and all of LeGuin’s magic that I’ve read (Earthsea especially).
America kills 20,000 people/yr via air pollution. Are you ready to walk away?
It’s worth noting, for ‘number of people killed’ statistics, that all of those people were going to die anyway, and many of them might have been about to die for some other reason.
Society kills about 56 million people each year from spending resources on things other than solving the ‘death’ problem.
Some of whom wouldn’t have died until several decades later. (Loss of QALYs would be a better statistic, and I think it would be non-negligible.)
I really don’t see why I can’t say “the negative utility of a dust speck is 1 over Graham’s Number.” or “I am not obligated to have my utility function make sense in contexts like those involving 3^^^^3 participants, because my utility function is intended to be used in This World, and that number is a physical impossibility in This World.”
As a separate response, what’s wrong with this calculation: I base my judgments largely on the duration of the disutility. After 1 second, the dust specks disappear and are forgotten, and so their disutility also disappears. The same is not true of the torture; the torture is therefore worse. I can foresee some possible problems with this line of thought, but it’s 2:30 am in New Orleans and I just got done with a long evening of drinking and Joint Mathematics Meeting, so please forgive me if I don’t attempt to formalize it now.
An addendum: 2 more things. The difference between a life with n dust specks hitting your eye and n+1 dust specks is not worth considering, given how large n is in any real life. Furthermore, if we allow for possible immortality, n could literally be infinity, so the difference would be literally 0.
Secondly, by virtue of your asserting that there exists an action with minimal disutility, you’ve shown that the Field of Utility is very different from the field of, say, the Real numbers, and so I am incredulous that we can simply “multiply” in the usual sense.
I really don’t see why I can’t say “the negative utility of a dust speck is 1 over Graham’s Number.”

You can say anything, but Graham’s number is very large; if the disutility of an air molecule slamming into your eye were 1 over Graham’s number, enough air pressure to kill you would have negligible disutility.
or “I am not obligated to have my utility function make sense in contexts like those involving 3^^^^3 participants, because my utility function is intended to be used in This World, and that number is a physical impossibility in This World.”

If your utility function ceases to correspond to utility at extreme values, isn’t it more of an approximation of utility than actual utility? Sure, you don’t need a model that works at the extremes—but when a model does hold for extreme values, that’s generally a good sign for the accuracy of the model.
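The air-pressure reductio is an order-of-magnitude claim, and it can be made explicit. The molecule count and the stand-in lower bound for Graham's number below are rough assumptions purely for illustration:

```python
# Graham's number G is far too large to store, but log10(G) certainly exceeds
# 10**100, so use that as a (vast) underestimate.
LOG10_G_FLOOR = 10**100
LOG10_LETHAL_IMPACTS = 27   # ~10^27 molecule impacts in a lethal dose of air pressure (rough)

# Under linear addition of per-molecule disutility 1/G:
#   log10(total disutility) = 27 - log10(G)
log10_total = LOG10_LETHAL_IMPACTS - LOG10_G_FLOOR
print(log10_total)          # about -10**100: "negligible" is an understatement
```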
An addendum: 2 more things. The difference between a life with n dust specks hitting your eye and n+1 dust specks is not worth considering, given how large n is in any real life. Furthermore, if we allow for possible immortality, n could literally be infinity, so the difference would be literally 0.

If utility is to be compared relative to lifetime utility, i.e. as (LifetimeUtility + x) / LifetimeUtility, doesn’t that assign higher impact to five seconds of pain for a twenty-year-old who will die at 40 than to a twenty-year-old who will die at 120? Does that make sense?
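With toy numbers (the lifetime-utility proxy and the pain term below are made up purely to illustrate the objection), the shorter life does come out with the larger relative impact:

```python
PAIN = -0.001  # disutility of five seconds of pain (made-up scale)

# Lifetime utility here is crudely proxied by years remaining past age twenty.
for dies_at, years_left in (("40", 20.0), ("120", 100.0)):
    relative = (years_left + PAIN) / years_left
    print(f"dies at {dies_at}: relative lifetime utility = {relative:.6f}")
# dies at 40:  0.999950  <- the same pain moves the ratio more
# dies at 120: 0.999990
```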
Secondly, by virtue of your asserting that there exists an action with minimal disutility, you’ve shown that the Field of Utility is very different from the field of, say, the Real numbers, and so I am incredulous that we can simply “multiply” in the usual sense.

Eliezer’s point does not seem to me predicated on the existence of such a value; I see no need to assume multiplication has been broken.
if the disutility of an air molecule slamming into your eye were 1 over Graham’s number, enough air pressure to kill you would have negligible disutility.

Yes, this seems like a good argument that we can’t add up disutility for things like “being bumped into by particle type X” linearly. In fact, it seems like having one molecule of air bump into me—or even however many molecules I breathe in a day—is a good thing, and so we can’t just talk about things like “the disutility of being bumped into by a given kind of particle.”
If your utility function ceases to correspond to utility at extreme values, isn’t it more of an approximation of utility than actual utility?

Yeah, of course. Why, do you know of some way to accurately access someone’s actually-existing Utility Function in a way that doesn’t just produce an approximation of an idealization of how ape brains work? Because me, I’m sitting over here using an ape brain to model itself, and this particular ape doesn’t even really expect to leave this planet or encounter or affect more than a few billion people, much less 3^^^3. So it’s totally fine using something accurate to a few significant figures, trying to minimize errors that would have noticeable effects on these scales.
Sure, you don’t need a model that works at the extremes—but when a model does hold for extreme values, that’s generally a good sign for the accuracy of the model.

Yes, I agree. Given that your model is failing at these extreme values and telling you to torture people instead of blink, I think that’s a bad sign for your model.
Yeah, absolutely, I definitely agree with that.
That would be failing, but 3^^^3 people blinking != you blinking. You just don’t comprehend the size of 3^^^3.
Well it’s self evident that that’s silly. So, there’s that.
Please don’t build a machine that will torture me to save you from dust specks.
If it were that many dust specks in one person’s eye, then the 50 years of torture would be reasonable, but getting dust specks in your eye doesn’t cause lasting trauma, and it doesn’t cause trauma to the people around you. Graham’s number is big, yes, but all these people will go about their lives as if nothing happened afterwards—won’t they? I feel like if someone were to choose torture for more than half a person’s life for one person over everyone having a minor discomfort for a few moments, and everyone knew that the person had made the choice, everyone who knew would probably want absolutely nothing to do with that person.
I feel like the length of the discomfort, and how bad the discomfort is, ends up outweighing the number of times it happens, as long as it happens to different people and not the same person. The torture would have lasting consequences as well, and the dust specks wouldn’t. I get your point and all, but I feel like dust specks compared to torture was a bad example to use.
How confident are you that physics has anything to do with morality?