I’ll make the point I want to make in a snarky way, and then in a less confrontational way.
Snarky: Tune in again next week to hear about how the real goal of effective altruists should be nothing less than eternal bliss for everyone, and EA donations should therefore all be directed to Christian missionary organizations!
Less confrontational: It isn’t (around here) very controversial to suggest that reducing human suffering is a worthy goal. It is very controversial indeed to suggest that the pursuit of “enlightenment” is in practice a feasible way to remove more than a small fraction of human suffering. (In communities where that pursuit is widely carried out and a lot of time and effort goes into it, so far as I can tell it’s still the case that few if any of the people involved would claim to have stopped feeling suffering.) Calling for effective altruists to investigate how best to pour lots of resources into helping people get “enlightened” (through meditation, psychotropic drugs, or whatever) seems waaaaay premature.
It’s a good subject for research, to be sure. But urging EAs as such to dive into it seems like what you do after that research has shown that yes, there is a reliable process by which people can be enabled never to suffer again, and no, it doesn’t have side effects likely to wreck their lives or others’.
Why the snarky comment is a brief version of the less confrontational one: It absolutely could turn out that Christianity is right, that a person’s eternal destiny depends on whether they’re Christian, and that trying to make lots of people into Christians is in fact the most cost-effective way of making people happier in the long (or indeed eternal) term. But on the face of it that seems unlikely, and if you disagree then you should be gathering compelling evidence for it before advocating it on LW or recommending it to EAs in general. And exactly the same things apply to the possibility that the most cost-effective way to improve human wellbeing involves trying to get everyone “enlightened”. It’s not a religious question as such (despite the obvious Buddhist connection), but it is a question whose answer is nowhere near settled in a way that would make this an appropriate thing to push on EAs.