The post argues that the most effective way to achieve EA goals is to prioritize spreading EA-ish values over making arguments that will appeal only to people whose values are already EA-ish. I don’t know whether that’s correct, but I fail to see how figuring out what’s most effective and doing it could be an abandonment of rationality in any sense that’s relevant here. Taking the path of least resistance, i.e., seeking maximum good done per unit cost, is pretty much the core of what EA is about, no?
Karma for the post is relatively low
OK. Inevitably some posts will have relatively low karma. On what grounds do you think this shouldn’t have been one of them?
moved more time value-adjusted money [...] over the next twenty years
I don’t think that’s at all what the post was assigning a 5% probability to.