Welcome!
Understanding and overcoming human cognitive biases is, of course, a recurring theme here. So is management of catastrophic (including existential) risks.
Discussions of charity come up from time to time, usually framed as optimization problems. This post gets cited often. We actually had a recent essay contest on efficient charity that might interest you.
The value of religion (as distinct from the value of charity, of community, and so forth) comes up from time to time but rarely goes anywhere useful.
Don’t sweat the karma.
If you don’t mind a personal question: where did you and your husband get married?
We got married in a small town near St. Catharines, Ontario, a few weeks after it became legal there.
Thanks for the charity links. I find practical and aesthetic value in the challenging aspect of “shut up and multiply” (http://lesswrong.com/lw/n3/circular_altruism/), particularly in the example you linked about purchasing charity efficiently. However, it seems to me that oversimplification can occur when we talk about human suffering.
(Please forgive me if the following rehashes something written earlier.) For example, treating a billion people each suffering for one second as equal to a billion consecutive seconds of suffering, and therefore as far worse than a million consecutive seconds (almost 12 straight days) of suffering endured by one person, is just plainly, rationally wrong. One proof of that: distributing those million seconds as one-second bursts at regular intervals over a person’s life is better than the million consecutive seconds, because the person is not otherwise unduly hampered by the occasional one-second annoyances, but would probably become unable to function well in the consecutive case, and might be permanently injured (à la PTSD). My point is that something is missing from the equation, and that missing something lies at the heart of the human impulse to be irrational when presented with the same choice framed once as comparative gain and once as comparative loss.
As you say, a million isolated seconds of suffering isn’t as bad as a million consecutive seconds of suffering, because of (among other things) the knock-on effects of the suffering being consecutive (e.g., PTSD). Maybe it’s only 10% as bad, or 1%, or 0.1%, or 0.0001%, or whatever. Sure, agreed, of course.
But the moral intuition being challenged by “shut up and multiply” isn’t about that.
If everyone agreed that, sure, for some N, N dust-specks are worse than 50 years of torture, and we were merely haggling over the price, the thought experiment would not be interesting. That’s why the thought experiment involves ridiculous numbers like 3^^^3 in the first place: so we can skip over all that.
When we’re trying to make practical decisions about what suffering to alleviate, we care about N, and precision matters. At that point we have to do some serious real-world thinking and measuring and, y’know, work.
But what’s challenging about “shut up and multiply” isn’t the value of N, it’s the existence of N. If we’re starting out with a moral intuition that dust-specks and torture simply aren’t commensurable, and therefore there is no value of N… well, then the work of calculating it is doomed before we start.
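(To make the price-haggling point concrete, here’s a minimal Python sketch with entirely made-up placeholder numbers; nothing in it comes from the thread itself. Under linear aggregation, any discount you apply for the suffering being isolated only moves the crossover N around; it doesn’t eliminate it.)

```python
# Minimal sketch with made-up placeholder numbers (not from the thread):
# under linear aggregation, a discount for isolated suffering only moves
# the crossover N, it doesn't eliminate it.

SPECK_DISUTILITY = 1e-9      # hypothetical disutility of one dust speck
ISOLATION_DISCOUNT = 1e-6    # hypothetical: isolated seconds count a millionth as much
TORTURE_DISUTILITY = 1e12    # hypothetical disutility of 50 years of torture

def aggregate_speck_disutility(n_people: float) -> float:
    """Linear aggregation across people, discounted because the suffering is isolated."""
    return n_people * SPECK_DISUTILITY * ISOLATION_DISCOUNT

# The crossover exists for any nonzero per-speck disutility, however small.
crossover_n = TORTURE_DISUTILITY / (SPECK_DISUTILITY * ISOLATION_DISCOUNT)
print(f"Specks outweigh torture once N exceeds roughly {crossover_n:.1e} people")
print(aggregate_speck_disutility(10 * crossover_n) > TORTURE_DISUTILITY)  # True
```

3^^^3 is unimaginably larger than any crossover you get this way, which is the point of using it.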
OK, I now understand how the site works: if someone responds to your comment, it shows up in your mailbox like an e-mail. Sorry for getting that wrong with Vaniver (I responded by private mail); if I can fix it in a little while, I will (edit: and now I have). Now, to content:
Thanks for responding to me! I didn’t feel like I should hijack the welcome thread with something that, for all I knew, had already been thoroughly discussed elsewhere. So I tried to be succinct, failed, and ended up garbled.
First, 3^^^3 is WAY more than a googolplex ;-)
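(For anyone unfamiliar with the notation, here is a small, purely illustrative Python sketch of Knuth’s up-arrow notation; only the smallest cases are actually computable, but it shows how fast the values grow.)

```python
# A small, illustrative sketch of Knuth's up-arrow notation; only the
# smallest cases are computable, but it shows how fast the values grow.

def up_arrow(a: int, n: int, b: int) -> int:
    """Compute a ↑^n b (a followed by n up-arrows, then b). Keep inputs tiny."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
# 3^^^3 = 3^^(3^^3): a power tower of 3s that is 7,625,597,484,987 levels tall.
# A googolplex is 10^(10^100); a tower of 3s only five levels tall already exceeds it.
```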
Second, I fully recognize the existence of N, and I tried to make that clear in the last substantive statement of my answer to you by recalling the central lesson of “shut up and multiply”: that people, when faced with identical situations presented at one time as gain comparisons and at another time as loss comparisons, will fail to recognize that the situations are identical and will choose differently. That is a REALLY useful thing to know about human bias, and I don’t discount it.
I suppose my comment above amounts to a quibble if it’s already understood that EY’s ideas only apply to identical situations presented with different gain/loss framings, but I don’t have the impression that’s all he was getting at. Hence my caveat. If everyone’s already beyond that, feel free to ignore it.
I agree that dust-specks and torture are commensurable. If you will allow, a personal story:

I have distichiasis. Look it up; it ain’t fun. My oily tear glands, on the insides of my eyelids, produce eyelashes that grow toward my eyes. Every once in a while, one of those (almost invisible and clear; mine rarely have any pigment at all) eyelashes grows long enough to brush my eye. At first I rarely notice, having been inured to the sensation. I only respond when the lash is long enough to wake me up in the middle of the night, and I struggle to pull out the invisible eyelash. Sometimes, rarely, it reaches just the right (wrong) length while I’m driving, and I clap my hand over my eye to hold it still until I get home.
If I could reliably relieve myself of this condition in exchange for one full day of hot, stinging torture, I would do so, as long as I could schedule it conveniently, because I could then get LASIK, which distichiasis, as things stand, strictly rules out for me. I even tried electrolysis, which burned and scarred my eyelids badly enough that the doctor finally suggested I’d better stop.
So, an individual’s choices about how they will consume their lot of torture can be wide-ranging. I recognize that. These calculations of EY’s do not recognize those differences. Sometimes it makes sense to shut up and multiply. Other times, when it’s available (as it often is), it makes sense to shut up and listen. Because of that inherent fact, the difference between your internal perception of your suffering and others’ external perception of it, we have a really useful built-in intuition: in otherwise equal situations, defer to the judgment of those who will suffer. We optimize not over suffering but over choice. That is our human nature. It may be irrational. But that nature should be addressed, not merely dismissed as a failure to multiply human suffering objectively enough.