I am not in a financial position to engage in philanthropy.
Of course you are, you just gave away $200. Good grief.
Not to pick on you… Well, actually, yes, to pick on you: what the hell is wrong with you people? If this were religious (a pilgrimage to Mecca, buying Mormon underpants, paying for one last course of Scientology auditing), you'd be laughing your ass off! But because it's cryonics...
How could you fail and compartmentalize so epically? This is, like, fractally bad: at every level, donating is a bad idea. It's probably a scam, so donating is a bad idea; if it weren't a scam, you still have no idea what she would really do with the money or how close to the cryonics fee she'd come, so donating is a bad idea; even if she would collect enough, donating to Alcor or the Brain Preservation Prize is a better idea; even if you wanted to donate to them, they're still almost certainly not as good as GiveWell's best charity; and so on.
As Konkvistador points out, I don't think people are being philanthropic; they're purchasing fuzzies.
Well, they're going to lose a whole lot of fuzzies when the scam is exposed.
Would you care to give a probability?
A scam like this is hard to expose if done right. How would you expose it in the absence of specific actionable information like a name, whose omission can be justified on privacy grounds? (Even a frontal photograph would give you little to go on unless you were lucky.)
As NancyLebovitz asked MileyCyrus… would you care to put odds on this being a scam? I see three broad outcomes: exposed as scam, confirmed by CI as not a scam (or similar strong evidence), and something inconclusive. A prediction on all three options (or more, if you see interesting ones) would be nice, or feel free to lump any two together.
I'll give an informal prediction: 5% chance of the scam being exposed, and confirmation by CI more likely than not.
Before Hank's claim about the Facebook profile (for which I will take his word), I would've given a 75% chance of underlying fraud. With the years of Facebooking up front, I'd revise downwards to 5-10%. Will CI confirm, or will everyone be left hanging? That's harder… She might just be too lazy, or events could intervene. I don't have a good guess about that sort of thing, so I'd only give around 60% odds of CI confirming her.
The hope that she might still get some financial assistance with cryonics from her Christian family could have inhibited her from making her identity completely transparent. Becoming a cryonics cause célèbre could also be the last straw in her relations with her parents.
So, let’s take bets, given those odds: 5% chance of a confirmed scam, 50% chance confirmed not a scam, and 45% chance that evidence is inconclusive.
So, you would gladly accept cash bets where you lose $5 if it is confirmed not a scam and gain $1 if it is confirmed a scam. Anyone willing to bet against those odds?
I think you got the numbers backwards.
I would take a bet where I lose $5 if it is confirmed a scam and gain $1 if it is confirmed not a scam. (I'm assuming the bet is off if there is no confirmation either way.)
At $5 vs. $1, I'm not sure the bet is worth the hassle. Make it $25 vs. $5, and I'm in. I'm happy to work with payment via PayPal or mailed cash or check.
Myself, I judge the odds differently: I'd call it about a 20% chance of a scam, but only a 0.2% chance of what I would call confirmation of that; and about an 80% chance that it is confirmed not to be a scam (such as by an independent verification from CI). I can't describe the criteria which, if met, would result in an inconclusive result, so I won't take any bets on that outcome; I also won't take the small bet, and can't afford the large bet.
Assuming this is not a scam, I would donate for practical reasons (and not only fuzzies): for those who plan to be frozen, we want cryonics to be popular. A public incident like this might make it into the news and make a difference. Plus, Reddit has gotten quite big.
There are a lot of things I'd like to say, but you have put forth a prediction:
It’s probably a scam
I would like to take up a bet with you on this ending up being a scam. This can be arbitrated by some prominent member of CI, Alcor, or Rudi Hoffman. I would win if an arbiter decides that the person who posted on Reddit was in fact diagnosed with cancer essentially as stated in her Reddit posts, and is in fact gathering money for her own cryonics arrangements. If none of the proposed arbiters can vouch for the above within one month (through September 18), then you will win the bet.
What odds would you like on this, and what’s the maximum amount of money you’d put on the line?
As I said in my other comment, I’m now giving 5-10% for scam. I’d be happy to take a 1:10 bet on the CI outcome risking no more than $100 on my part, but I think 1 month is much too tight; 1 January 2013 would be a better deadline with the bet short-circuiting on CI judgment.
Done. $100 from you vs $1000 from me. If you lose, you donate it to her fund. If I lose, I can send you the money or do with it what you wish.
Wait, I'm not sure we're understanding each other. I thought I was putting up $100, and you'd put up $10; if she turned out to be a scam (as judged by CI), I lose the $100 to you—while if she turned out to be genuine (as judged by CI), you would lose the $10 to me.
Well I still accept, since now it's a much better deal for me!
Um, the way I'm reading this it looks like gwern is taking the position you were originally trying to take?
Yes, that's my take too… I'm not sure what mtaran is doing here; maybe he doesn't care at all about the odds or which side he's taking? I don't mind betting, but I do insist, when it's for real sums of money like $100, that my counterparty knows what he's agreeing to. And I'm not sure mtaran does.
For what it’s worth, I too haven’t understood what side you are taking. Usually people bet on outcomes which they think are more probable than their opponent thinks.
I spent way too much time thinking about the same thing. It seems to me that if mtaran believes the chance of a scam is lower than gwern's, they should both agree that gwern takes the larger payout, in exchange for his smaller chance of being correct about it being a scam.
For what it's worth, I do not think mtaran's original bet was that great for gwern to begin with. A $1,000-to-$100 bet implies a scam probability of 9.1%, but gwern stated his probability as 5-10%, putting the implied odds at the higher end of gwern's estimate. If mtaran is truly confident, he needs to offer at least a $1,900-to-$100 payout, as this would match gwern's lowest estimate of the scam probability (5%).
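The implied probability behind a payout ratio is just the smaller stake divided by the total pot; a quick sketch checking the figures above (purely illustrative, using only the dollar amounts quoted in the thread):

```python
def implied_p(stake, payout):
    """Scam probability at which risking `stake` to win `payout` breaks even."""
    return stake / (stake + payout)

# gwern risks $100 to win $1,000: break-even at ~9.1%
assert round(implied_p(100, 1000), 3) == 0.091
# a $1,900-to-$100 offer would move the break-even point down to 5%
assert round(implied_p(100, 1900), 3) == 0.05
```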
Ok, I misread one of gwern’s replies. My original intent was to extract money from the fact that gwern gave (from my vantage point) too high a probability of this being a scam.
Under my original version of the terms, if his P(scam) was .1:
he would expect to gain $1000 .1 of the time;
he would expect to lose $100 .9 of the time;
yielding an expected value of +$10.
Under my original version of the terms, if his P(scam) was .05:
he would expect to gain $1000 .05 of the time;
he would expect to lose $100 .95 of the time;
yielding an expected value of -$45.
In the second case, he would of course not want to take that bet. I'd thus like to amend my suggested conditions to have gwern only put $52 at stake against my $1000. For any P(scam) > .05 this is a positive expected value, so I would expect it to be satisfactory to gwern.
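This expected-value arithmetic, and the break-even point of the amended $52 stake, can be reproduced in a few lines (a sketch using only the figures quoted in the comment):

```python
def expected_value(p_scam, win, lose):
    """gwern's EV: gain `win` with probability p_scam, else lose `lose`."""
    return p_scam * win - (1 - p_scam) * lose

# Original terms: $1000 against gwern's $100.
assert round(expected_value(0.10, 1000, 100)) == 10    # +$10 at P(scam) = .1
assert round(expected_value(0.05, 1000, 100)) == -45   # -$45 at P(scam) = .05

# With only $52 at stake against $1000, the bet breaks even just under P(scam) = .05:
assert round(52 / (52 + 1000), 4) == 0.0494
```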
Alright then, I accept. The wager is thus:
on 1 January 2013, if CI confirms that she is really dying and has or is in the process of signing up with membership & life insurance, then I will pay you $52; if they confirm the opposite, confirm nothing, or she has made no progress, you will pay me $1000.
In case of a dispute, another LWer can adjudicate; I nominate Eliezer, Carl Shulman, Luke, or Yvain (I haven’t asked but I doubt all would decline).
For me, paying with PayPal is most convenient, but if it isn't for you, we can arrange something else (perhaps you'd prefer I pay the $52 to a third party or charity). I can accept PayPal or Bitcoin.
This isn't fair. She probably can't get life insurance with a terminal diagnosis. She's more likely to pay for the suspension outright.
Oh right; I'm so used to cryonics funding always being life insurance I forgot it wouldn't apply here. Hm… I was trying for some sort of verification of financial commitment to being preserved. Suggestions on wording?
My understanding is that if CI confirms this, they're likely to set up a fund, which is what I plan to donate to.
I'm not sure at what point during the process CI would take the suspension money, so I don't know.
Genius, I should have thought of that ^_^
From a hedonistic point of view, what would you propose to be a better way to buy warm fuzzies for those already moved by and emotionally invested in the story?
Maybe a charity specializing in Africans which will send you pictures of little kids? Another option might be to go to the local pound, play with some of the kittens, and then donate; if warm fuzzy kittens and cats don’t get you warm fuzzies, I dunno what will!
(Best of course would be to not fall for the trap in the first place.)
Maybe a charity specializing in Africans which will send you pictures of little kids?
You're missing the similarity drive. Pictures of smiling Africans are different from "this girl thinks like me": the former are just kindred bodies, the latter are kindred spirits.
Not to fall into the “trap” of buying warm fuzzies? Do you advocate a policy of never buying yourself any warm fuzzies, or just of never buying warm fuzzies specifically through donating to charity (because it’s easy to trick your brain into believing it just did good)?
Yes, I am deeply suspicious of Eliezer’s post on warm fuzzies vs utilons because while I accept that it can be a good strategy, I am skeptical that it actually is: my suspicion is that for pretty much all people, buying fuzzies just crowds out buying utilons.
For example, I asked Konkvistador on IRC, since he was planning on buying fuzzies by donating to this person, what utilons he was planning on buying, especially since he had just mentioned he had very little money to spare. He replied with something about not eating ice cream and drinking more water.
He replied with something about not eating ice cream and drinking more water.
I was going for how this increase in fuzzy spending would be counteracted by me specifically cutting out fuzzy spending elsewhere, so total fuzzy spending remains unchanged by this particular decision.
So now you’re passing the buck twice: you’re passing the buck from donating to her to actually cutting down on the ice cream, and from there, you’re passing the buck to having increased hedons to at some point buying more utilons. Do you see why this sort of reasoning makes me go all -_- ?
Also, me losing weight does bring me utility.
More rationalizing doesn’t make me feel better either!
More rationalizing doesn’t make me feel better either!
You've shamed me for not currently donating much to optimal charity. This has caused me to want to lower my current warm-fuzzies spending, increasing my available financial resources, and give more to SIAI (which I currently think is the optimal charity). Thus I've decided to cut the last thing on the list of stuff I enjoy: ice cream and sweets. The losing-weight comment was indeed searching for a silver lining to cutting it off my list.
If you hadn't shamed me, I'd still be happily enjoying my large supply of warm fuzzies from ice cream and other things, but you've unfortunately devalued it, depriving me of that fun. That's probably worth it for you, since it's more than offset by gains in other people.
I am, however, pretty bothered by how worked up some people were getting over this. I'm pretty certain that if I shared a good deal for buying a huge collection of anime on LessWrong and said that some people might find it valuable and should totally check it out, no one would bring up optimal charity. I explicitly put this in the same bin as buying anime, yet people attack it nevertheless.
Do you see why this sort of reasoning makes me go all -_- ?
I can see why.
Looking at myself from the outside, I think I'm clearly defending an emotional attachment to a course of action. But I don't remember deceiving myself or anyone else into thinking this was something worth doing on efficient-charity grounds, for the good of the world. This was framed as personal spending, and as talk about such spending. We do this all the time; check out the fiction and other media recommendation threads we have running. I don't see optimal-charity advocates stomping in there and haranguing people for consuming such media. Indeed, I can take this comparison further in this case and point out that she is producing media (videos, writing); lots of people consume media that is available for free yet still donate to the authors for creating such content.
I didn't even expect as many people on LessWrong to donate, or feel inclined to donate, as have; I was actually hoping more for people here to go and engage the pro-death arguments on Reddit than to donate themselves. I also made the argument in that thread (though perhaps not here) that efficient-charity advocacy won't result in more efficient charity, but anti-cryonics or pro-death memes might be impacted in a way we'd all like.
Fuzzies are utilons.
Personally, I prefer Kickstarter projects for my fuzzies, because they typically also come with direct physical rewards.
Not sure if your suggestion would work at this stage. This dying (assuming it's not a scam) person is already real and embedded in their hearts, especially if they read her older posts. They would have to pick cute kittens or sad pictures over a cancer girl, not an obvious decision. Like Murder-Gandhi, they have been irreversibly changed and would require a sobering pill to snap out of it, which they would probably refuse to take.
Assuming her story is not a scam, ponder why I find the idea of donating for cute kittens instead of helping another human being facing death and begging for help repugnant.
I have; now please ponder why I might find repugnant the idea of donating towards something as inefficient and low-probability as cryonics rather than the very high probability charities identified by GiveWell, based solely on some identity politics and a Reddit post.
If everyone is going to justify donating to her on fuzzies, then have the guts to defend fuzzies. Fuzzies are not a good way of helping human beings ‘facing death’: that’s the point. Don’t equivocate between arguing that donating to her is a good way of making you feel better, and arguing that donating to her is a utilitarianly optimal sort of donation.
You make an interesting assumption that we care about other people in general. If you assume that we model the human species as a group of people on a bell curve, split fifty percent above and fifty percent below the zero-value line symmetrically, then it's perfectly rational to give only to people whom we are familiar enough with to rank in the positive half.
Note: I do not believe this.
Also, if you actually believe in optimal charity for utilitarian reasons, then abusing people for sub-optimal charity is ridiculous. It does not make them more likely to engage in optimal charity, it makes them more likely not to engage in charity at all. You’re shooting your cause in the foot at least as much as they are.
It does not make them more likely to engage in optimal charity, it makes them more likely not to engage in charity at all.
It may make them overall less likely to engage in charity, yes, but if they do, it also makes them more likely to engage in optimal charity*. Since optimal charity is something like 2-3 orders of magnitude better than this particular instance of fuzzy charity, I should be willing to cause a lot of overall drops in charity in exchange for diverting a small fraction of that to an optimal charity.
* If it doesn't even do that, though, then I have some serious problems on my hands.
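The trade-off being asserted here is easy to make concrete. A hedged sketch with illustrative numbers (the 100x multiplier is an assumption standing in for "2-3 orders of magnitude"; nothing here comes from actual charity data):

```python
multiplier = 100.0   # assumed: optimal charity ~100x more effective (low end of 2-3 orders of magnitude)
fuzzy_value = 1.0    # normalized value if everyone keeps giving to the fuzzy charity

# Suppose the criticism drives 90% of donors away from giving entirely,
# but diverts the remaining 10% to the optimal charity:
diverted_value = 0.10 * multiplier

# Even that large overall drop in giving comes out far ahead: 10.0 vs 1.0
assert diverted_value > fuzzy_value
```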
The next time people are presented with an opportunity for charity (any opportunity), their last memory is now changed from 'hey, I was charitable a couple of months ago, and that was nice' to 'hey, I was charitable a couple of months ago, and this optimal-charity jerk made me feel terrible about it.'
You’re making them less likely to give in general, and, by being rude about it, you’re also damaging the PR brand of your cause, which will hurt you more than you think. I don’t know of any corporation that advertises its product by abusing its customers.
This is likely to be the case if gwern were to act in such a way in the vast majority of environments. However, in this particular online community, criticizing people for publicly donating to suboptimal charity may well be a fairly good method for gwern to produce utilons.
Indeed. Consistent with this situational point, I also recently advised not attempting to go over to The Oatmeal and related forums and evangelize for optimal charity.
I have. You know what, you’re perfectly right, there are better ways to help people, and that’s even if you’re selfish and wish to help groups in which you’re likely to find yourself, for instance setting a precedent of people helping needy, terminally ill cryonics patients because “someday I could be in her shoes”.
Why did you have to? Do you feel like the strength of your arguments alone wouldn’t suffice?
That’s exactly it. This page is stuffed with identity politics, prewritten bottom lines, base-rate neglect, likely sexism, sheer abandonment of optimal charity, scope insensitivity, equivocation & abuse of fuzzies vs utilons, and so on.
This is all LW orthodoxy to the extent there is such a thing, yet even so, the pull of 'dying cute girl wants cryonics! MUST HELP!' is so strong that LW orthodoxy + good rhetoric* still earns me a mix of heavy downvotes and upvotes, with the flow of donations apparently unabated.
* I don’t think I’m very good at rhetoric, but I’ll take your word for it.
Do you think your strategy is channeling more money to efficient charities, as opposed to random personal consumption (such as a nice computer, movies, video games, or a personal cryonics policy)?
A more positive approach might work well: donate for fuzzies, but please extrapolate those feelings to many more utilons. I just used this technique to secure far more utilons than I have seen mentioned in this thread, and it seems like it might be the most effective among the LW crowd.
Great textbook example of the biases affecting charitable giving, isn’t it? People will give more to a single, identifiable person than to an anonymous person or a group. People want to feel like they actually changed something they can directly see, rather than contributing a small amount to a big goal; etc.
I think what hankx means is that (s)he’s not in a position to donate large amounts of money (as in large enough to save 50 or more life-years). However, $100 is still enough to buy warm fuzzies.
Of course you are, you just gave away $200. Good grief.
Not to pick on you… Well actually yes, to pick on you: What the hell is wrong with you people? If this were religious-oriented—for a pilgrimage to Mecca or buy Mormon underpants or pay for one last course of Scientology auditing—you’d be laughing your ass off hysterically! But because it’s cryonics...
How could you fail and compartmentalize so epically? This is like, fractally bad: at every level, donating is a bad idea. It’s probably a scam, so donating is a bad idea; if it weren’t a scam, you still have no idea what she would really do with it or how close to the cryonics fee she’d come, so donating is a bad idea; even if she would collect enough, donating to ALCOR or the Brain Preservation Prize is a better idea; even if you wanted to donate to them, they’re still almost certainly not as good as Givewell’s best charity; and so on.
As Konkvistador points out, I don’t think people are being philanthropic, they’re purchasing fuzzies.
Well they’re going to lose a whole lot of fuzzies when the scam is exposed.
Would you care to give a probability?
A scam like this is hard to expose if done right. How would you expose it in the absence of specific actionable information like a name, whose omission can be justified on privacy grounds? (Even a frontal photograph would give you little to go on unless you were lucky.)
As NancyLebovitz asked MileyCyrus… would you care to put odds on this being a scam? I see three broad outcomes: exposed as scam, confirmed by CI as not a scam (or similar strong evidence), and something inconclusive. A prediction on all three options (or more, if you see interesting ones) would be nice, or feel free to lump any two together.
I’ll give an informal prediction: 5% of scam exposed, more likely confirmed by CI than not.
Before Hank’s claim about the Facebook profile (for which I will take his word), I would’ve given 75% chance of an underlying fraud. With the years of Facebooking up front, I’d revise downwards to 5-10%. Will CI confirm or will everyone be left hanging? That’s harder… She might just be too lazy or events could intervene. I don’t have a good guess about that sort of thing, so I’d too only give around >60% odds of CI confirming her.
Still hoping she might get some cryo-financial assistance from her Christian family could have inhibited her from making her identity completely transparent. Becoming a cryonics cause-celebre could also be the last straw in her relations with her parents.
So, let’s take bets, given those odds: 5% chance of a confirmed scam, 50% chance confirmed not a scam, and 45% chance that evidence is inconclusive.
So, you would gladly accept cash bets where you lose $5 if it is confirmed not a scam and gain $1 if it confirmed a scam. Anyone willing to bet against those odds?
I think you got the numbers backwards.
I would take a bet where I lose $5 if it is confirmed a scam and gain $1 if it is confirmed not a scam. (I’m assuming the bet is off if there is no confirmation either way.)
At $5 vs $1, I’m not sure the bet is worth the hassle. Make it $25 vs $5, and I’m in. I’m happy to work with payment via paypal or mailed cash or check.
Myself, I judge the odds differently: I’d call it about a 20% chance of a scam, but a .2% chance of what I would call confirmation of that; and about an 80% chance that it is confirmed not to be a scam (such as by a independent verification from the CI. I can’t describe the criteria which, if met, would result in a inconclusive result, so I won’t take any bets on that outcome; I also won’t take the small bet, and can’t afford the large bet.
Assuming this is not a scam, I would donate for practical reasons (and not only fuzzies) - for those who plan to be frozen, we want cryonics to be popular. A public incident like this might make it into news, etc., and make a difference. Plus Reddit has gotten quite big.
There are a lot of things I’d like to say, but you have put forth a prediction
I would like to take up a bet with you on this ending up being a scam. This can be arbitrated by some prominent member of CI, Alcor, or Rudi Hoffman. I would win if an arbiter decides that the person who posted on Reddit was in fact diagnosed with cancer essentially as stated in her Reddit posts, and is in fact gathering money for a her own cryonics arrangements. If none of the proposed arbiters can vouch for the above within one month (through September 18), then you will win the bet.
What odds would you like on this, and what’s the maximum amount of money you’d put on the line?
As I said in my other comment, I’m now giving 5-10% for scam. I’d be happy to take a 1:10 bet on the CI outcome risking no more than $100 on my part, but I think 1 month is much too tight; 1 January 2013 would be a better deadline with the bet short-circuiting on CI judgment.
Done. $100 from you vs $1000 from me. If you lose, you donate it to her fund. If I lose, I can send you the money or do with it what you wish.
Wait, I’m not sure we’re understanding each other. I thought I was putting up $100, and you’d put up $10; if she turned out to be a scam (as judged by CI), I lose the $100 to you—while if she turned out to be genuine (as judged by CI), you would lose the $10 to me.
Well I still accept, since now it’s a much better deal for me!
Um, the way I’m reading this it looks like gwern is taking the position you were originally trying to take?
Yes, that’s my take too… I’m not sure what mtaran is doing here—maybe he doesn’t care at all about the odds or which side he’s taking? I don’t mind betting, but I do insist that—when it’s for real sums of money like $100 - that my counterparty knows what he’s agreeing to. And I’m not sure mtaran does.
For what it’s worth, I too haven’t understood what side you are taking. Usually people bet on outcomes which they think are more probable than their opponent thinks.
I spent way too much time thinking about the same thing. It seems to me, if mtaran believes chance of scam is lower than Gwern’s, they should both agree that Gwern take the larger payout for smaller chance of being correct about it being a scam.
For what it’s worth I do not think that mtaran’s original bet is was that great for Gwern to begin with. A $1,000 to $100 bet implies odds of scam is at 9.1%, however Gwern stated his probability is 5%-10% putting the odds at the higher end of Gwern’s estimate. If Mtaran is truly confident he needs to offer at least $1,900 to $100 payout for as this will match Gwern’s lowest percentage for being a scam (5%).
Ok, I misread one of gwern’s replies. My original intent was to extract money from the fact that gwern gave (from my vantage point) too high a probability of this being a scam.
Under my original version of the terms, if his P(scam) was .1:
he would expect to get $1000 .1 of the time
he would expect to lose $100 .9 of the time
yielding an expected value of $10
Under my original version of the terms, if his P(scam) was .05:
he would expect to get $1000 .05 of the time
he would expect to lose $100 .95 of the time
yielding an expected value of -$45
In the second case, he would of course not want to take that bet. I’d thus like to amend my suggested conditions to have gwern only put $52 at stake against my $1000. For any P(scam) > .05 this is a positive expected value, so I would expect it to have been satisfactory to gwern[19 August 2012 01:53:58AM].
Alright then, I accept. The wager is thus:
on 1 January 2013, if CI confirms that she is really dying and has or is in the process of signing up with membership & life insurance, then I will pay you $52; if they confirm the opposite, confirm nothing, or she has made no progress, you will pay me $1000.
In case of a dispute, another LWer can adjudicate; I nominate Eliezer, Carl Shulman, Luke, or Yvain (I haven’t asked but I doubt all would decline).
For me, paying with Paypal is most convenient, but if it isn’t for you we can arrange something else (perhaps you’d prefer I pay the $52 to a third party or charity). I can accept Paypal or Bitcoin.
This isn’t fair. She probably can’t get life insurance with a terminal diagnosis. She’s more likely to pay for the suspension outright.
Oh right; I’m so used to cryonics funding always being life insurance I forgot it wouldn’t apply here. Hm… I was trying for some sort of verification of financial commitment to being preserved. Suggestions on wording?
My understanding is that if CI confirms this, they’re likely to set up a fund, which is what I plan to donate to.
I’m not sure at what point during the process CI would take the suspension money, so I don’t know.
Genius, I should have thought of that ^_^
From a hedonistic point of view, what would you propose to be a better way to buy warm fuzzies for those already moved by and emotionally invested in the story?
Maybe a charity specializing in Africans which will send you pictures of little kids? Another option might be to go to the local pound, play with some of the kittens, and then donate; if warm fuzzy kittens and cats don’t get you warm fuzzies, I dunno what will!
(Best of course would be to not fall for the trap in the first place.)
You’re missing the similarity drive. Pictures of smiling Africans is different from “this girl thinks like me”- the former are just kindred bodies, the latter are kindred spirits.
Not to fall into the “trap” of buying warm fuzzies? Do you advocate a policy of never buying yourself any warm fuzzies, or just of never buying warm fuzzies specifically through donating to charity (because it’s easy to trick your brain into believing it just did good)?
Yes, I am deeply suspicious of Eliezer’s post on warm fuzzies vs utilons because while I accept that it can be a good strategy, I am skeptical that it actually is: my suspicion is that for pretty much all people, buying fuzzies just crowds out buying utilons.
For example, I asked Konkvistador on IRC, since he was planning on buying fuzzies by donating to this person, what utilons he was planning on buying, especially since he had just mentioned he had very little money to spare. He replied with something about not eating ice cream and drinking more water.
I was going for how this increase in fuzzy spending would be counteracted by me specifically cutting out fuzzy spending elsewhere, so total fuzzy spending remains unchanged by this particular decision.
Also me loosing weight does bring me utility.
So now you’re passing the buck twice: you’re passing the buck from donating to her to actually cutting down on the ice cream, and from there, you’re passing the buck to having increased hedons to at some point buying more utilons. Do you see why this sort of reasoning makes me go all -_- ?
More rationalizing doesn’t make me feel better either!
You’ve shamed me not currently donating much to optimal charity. This has caused me to want to lower my current warm fuzzies spending increasing my available financial resources and give more to SIAI (which I currently think is the optimal charity). Thus I’ve decided to cut the last thing on the list of stuff I enjoy: Ice cream and sweets. The loosing weight comment was indeed searching for a silver lining to cutting it off my list.
If you hadn’t shamed me I’d still be happily enjoying my large supply of warm fuzzies from ice cream and other things, but you’ve unfortunately devalued it depriving me of that fun. That’s probably worth it for you since its more than offset by gains in other people.
I am however pretty bothered by how worked up some people where getting over this. I’m pretty certain that if I shared a good deal for buying a huge collection of anime on LessWrong and said that some people might find this valuable and they should totally check it out, no one would bring up optimal charity. I explicitly put this in the same bin as buying anime yet people attack it nevertheless.
I can see why.
Looking at myself from the outside I think I’m clearly defending an emotional attachment to a course of action. But I don’t remember deceiving myself or anyone else into thinking this was something worth doing on efficient charity grounds for the good of the world. This was framed as personal expenditures spending and talking about such spending. We do this all the time, check out the fictional book and other media recommendation threads we have running. I don’t see optimal charity advocates stomping there haranguing people for consuming such media. Indeed I can take this comparison further in this case and point out she is producing media (videos, writing), lots of people consume media that is available for free yet still donate to the authors for creating such content.
I didn’t even expect as many people as have to donate or feel inclined to donate on LessWrong, I was actually hoping more for people here to go and engage the pro-death arguments on reddit than donate themselves. I also made the argument that efficient charity advocacy in that thread (thought perhaps not here) won’t result in more efficient charity, but anti-cryonics or pro-death memes might be impacted in a way we’d all like.
Fuzzies are utilons.
Personally, I prefer Kickstarter projects for my fuzzies, because they typically also come with direct physical rewards.
Not sure if your suggestion would work at this stage. This dying (assuming it’s not a scam) person is already real and embedded in their hearts, especially if they read her older posts. They would have to pick cute kittens or sad pictures over a cancer girl, not an obvious decision. Like Murder-Gandhi, they have been irreversibly changed and would require a sobering pill to snap out of it, which they would probably refuse to take.
Assuming her story is not a scam, ponder why I find the idea of donating for cute kittens instead of helping another human being facing death and begging for help repugnant.
I have; now please ponder why I might find repugnant the idea of donating towards something as inefficient and low-probability as cryonics rather than the very high probability charities identified by GiveWell, based solely on some identity politics and a Reddit post.
If everyone is going to justify donating to her on fuzzies, then have the guts to defend fuzzies. Fuzzies are not a good way of helping human beings ‘facing death’: that’s the point. Don’t equivocate between arguing that donating to her is a good way of making you feel better, and arguing that donating to her is a utilitarianly optimal sort of donation.
You make an interesting assumption that we care about other people in general. If you assume that we model the human species as a bell curve split symmetrically, with fifty percent above and fifty percent below the zero-value line, then it’s perfectly rational to give only to people we are familiar enough with to rank in the positive half.
Note: I do not believe this.
Also, if you actually believe in optimal charity for utilitarian reasons, then abusing people for sub-optimal charity is ridiculous. It does not make them more likely to engage in optimal charity, it makes them more likely not to engage in charity at all. You’re shooting your cause in the foot at least as much as they are.
It may make them overall less likely to engage in charity, yes, but if they do, it also makes them more likely to engage in optimal charity*. Since optimal charity is something like 2-3 orders of magnitude better than this particular instance of fuzzy charity, I should be willing to cause a lot of overall drops in charity in exchange for diverting a small fraction of that to an optimal charity.
* If it doesn’t even do that, though, then I have some serious problems on my hands.
The next time people are presented with an opportunity for charity (any opportunity), their last memory is now changed from ‘hey, I was charitable a couple of months ago, and that was nice’ to ‘hey, I was charitable a couple of months ago, and this optimal-charity jerk made me feel terrible about it.’
You’re making them less likely to give in general, and, by being rude about it, you’re also damaging the PR brand of your cause, which will hurt you more than you think. I don’t know of any corporation that advertises its product by abusing its customers.
This is likely to be the case if gwern were to act in such a way in the vast majority of environments. However, in this particular online community, criticizing people for publicly donating to suboptimal charity may well be a fairly good method for gwern to produce utilons.
Indeed. Consistent with this situational point, I also recently advised against going over to The Oatmeal and related forums to evangelize for optimal charity.
http://xkcd.com/871/
Which doesn’t address my point, but just reiterates the argument of the first comment.
I have. You know what, you’re perfectly right, there are better ways to help people, and that’s even if you’re selfish and wish to help groups in which you’re likely to find yourself, for instance setting a precedent of people helping needy, terminally ill cryonics patients because “someday I could be in her shoes”.
You’re also too good at rhetoric for your own good. I wouldn’t have been so distracted from the content of your message if you hadn’t been acting so aggressive, indignant, and grandiloquent in the comments from the beginning. Why did you have to? Did you feel the strength of your arguments alone wouldn’t suffice? Or were you too engrossed in the game of putting your ideas forward and destroying those on the other side?
That’s exactly it. This page is stuffed with identity politics, prewritten bottom lines, base-rate neglect, likely sexism, sheer abandonment of optimal charity, scope insensitivity, equivocation & abuse of fuzzies vs utilons, and so on.
This is all LW orthodoxy to the extent there is such a thing, yet even so, the pull of ‘dying cute girl wants cryonics! MUST HELP!’ is so strong that LW orthodoxy + good rhetoric* still earns me a mix of heavy down and upvotes with the flow of donations apparently unabated.
* I don’t think I’m very good at rhetoric, but I’ll take your word for it.
Do you think your strategy is channeling more money to efficient charities, as opposed to random personal consumption (such as a nice computer, movies, video games, or a personal cryonics policy)?
A more positive approach might work better: donate for fuzzies, but then extrapolate those feelings into many more utilons. I just used this technique to secure far more utilons than I have seen mentioned in this thread, and it seems like it might be the most effective approach among the LW crowd.
Great textbook example of the biases affecting charitable giving, isn’t it? People will give more to a single, identifiable person than to an anonymous person or a group. People want to feel like they actually changed something they can directly see, rather than contributing a small amount to a big goal; etc.
As a counterpoint to your generalization, JGWeissman has given 82x more to SIAI than he plans to give to this girl if her story checks out.
And how many JGWs are there in the world?
More and more, if I can do anything about it. (Edit since someone didn’t like this comment: That’s a big if. I’m trying to make it smaller.)
I think what hankx means is that (s)he’s not in a position to donate large amounts of money (as in large enough to save 50 or more life-years). However, $100 is still enough to buy warm fuzzies.
No, I mean this is not just about fuzzies for me.