EA could have different well-defined membership levels, such as gold, silver and bronze, all of which correspond to different amounts of money or labor regularly donated.
It feels like it would be counterproductive to assign people levels. There is a sort-of two-tier structure already: The Life You Can Save Pledge (>1%) and the Giving What We Can Pledge (>10%). I encourage cheering people on (and engaging with them in a productive conversation) no matter where they currently are—even if they are just giving $20 to an obviously non-cost-effective charity.
What purpose do you think this arrangement might serve?
Firms that sell low-marginal-cost products face the problem of how to get the most revenue when some customers are willing to pay a lot more than others. One solution is to sell only to the customers willing to pay a lot, which seems analogous to EA not recognizing soft-core donors. Another solution is to sell products of different quality, in the hope that soft-core customers pay a bit while you still get lots of revenue from the customers who place a greater monetary value on your product.
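To make the revenue argument concrete, here is a minimal numerical sketch of the two strategies. The segment sizes and willingness-to-pay figures are entirely made up for illustration (nothing here comes from the discussion), and the sketch assumes each segment buys only the tier aimed at it:

```python
# Rough sketch of the two strategies: sell only at a high price,
# vs. offer a cheap tier and an expensive tier.
# All numbers are invented; assumes each segment buys only the tier
# priced at its own willingness to pay.

hardcore = {"count": 100, "willing_to_pay": 100}   # willing to pay a lot
softcore = {"count": 1000, "willing_to_pay": 10}   # willing to pay a little

# Strategy 1: a single high price -- only the hardcore segment buys.
single_price_revenue = hardcore["count"] * hardcore["willing_to_pay"]

# Strategy 2: two tiers, each priced at one segment's willingness to pay,
# so both segments buy something.
two_tier_revenue = (hardcore["count"] * hardcore["willing_to_pay"]
                    + softcore["count"] * softcore["willing_to_pay"])

print(f"Single high price: ${single_price_revenue:,}")  # $10,000
print(f"Two tiers:         ${two_tier_revenue:,}")      # $20,000
```

The two-tier strategy only comes out ahead if the tiers are differentiated enough that the customers who value the product highly don't simply defect to the cheap tier, which is why the comment above talks about selling products of different quality rather than the same product at two prices.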
Interesting. So the implication would be to draw a clear distinction between softcore and hardcore EAs, and give appropriate plaudits to each?
The purpose of market segmentation is to maximize revenue :-/
It seems like a valid application to me if one takes people to be paying some kind of cost in terms of their time and money in order to receive recognition for being a good person. You would like people to be able to spend a moderate amount of time and money to receive a moderate amount of recognition, and a large amount of time and money to receive a large amount of recognition. Having terms for different levels could help with this.
Let me rephrase it as ‘I can buy the status of “a good person”’. Still fine?
http://lesswrong.com/lw/e95/the_noncentral_fallacy_the_worst_argument_in_the/
You’re arguing by analogy (http://lesswrong.com/lw/vx/failure_by_analogy/) and resorting to guilt by association. It’s an appeal to emotion, not reason.
Literally every other website on the internet will allow bad arguments like this… are you sure you don’t want to hang out somewhere else? Seriously, give it some thought.
LOL. You have to try harder :-P
Seems like the same as your “becoming a superdonor” idea: you want to attach status to doing a lot of good.
Becoming a superdonor would be the same as becoming an EA. I’m not convinced that we should have levels of EA-ship or superdonor-ship. I’m open to being convinced, of course; I just don’t intuitively see the value of it.
...charisma is not a bloody dump stat, and you can’t just substitute “Intelligence” in for it, no matter what feats you take.
Bloody hell. Great, somebody came up with an idea on how to make EA work. Only, it’s an absolutely terrible idea, but since nobody has come up with anything else, you people are going for it.
This change doesn’t pattern-match with status. It pattern-matches with cult, for the exact same reason Gleb’s behavior pattern-matches with sociopath. And the reason is that you are blindly tampering with the exact same mental levers that cults (and sociopaths) tamper with, with a likewise imperfect understanding of how they work. We don’t call them cults or sociopaths when they’re good at what they do, you see, because we do not notice.
But even if there weren’t pattern-matching that would completely nuke any credibility EA has as a result of this dropout-of-cult-leader-school approach, what would an ordinary person do if confronted with a symbol indicating that somebody has more status than them?
They’ll buy into the first half-credible explanation of why that status symbol doesn’t mean anything, and then they’ll feel higher status than all the “dumb” people who do buy into it, because few people can tell the difference between cynicism and intelligence (disturbingly few people even realize the two concepts are distinct). Think about all the people already doing exactly that because some villagers apparently fish with their mosquito nets, thus “disproving” that EA can be useful. Clearly the argument is fallacious, but that doesn’t matter, because they’re rationalizing what they already want to believe, because EA is already threatening their sense of relative status.
Remember the HPMoR bit about Phoenixes as status symbols? Same principle.
Gleb Tsipursky’s point is that this kind of stratification is what happens anyway. You want softcore EAs to get more recognition than that, because their contribution to EA as a social group is mostly hidden and not linked to visible effort within the EA movement.