Saving 725,000 lives per year on its own is an absurdly heroic act. Some pretty good EAs have already spent billions of dollars trying to do something way less impactful. Against that heroic act, considerations like “Don’t do things without a permit from the Senegalese government” or “Don’t vaguely associate do-gooders with political pariahs” fail to even begin to tip the scales.
Would anyone here consider afflicting millions with malaria to protect the good image of gene drives and effective altruists, and maybe prevent a local government from being angry at them? I sure hope not, and even in less important circumstances I think if the groups I love (rationalists, EAs) trade off PR against actually helping others, they will lose 99% of what makes them a valuable shining light for civilization in the first place. Having a strong deontology is great, but there’s simply no room for undue concern about Vox or Hitler when it comes to making these high-stakes decisions.
Unfortunately for those doing this kind of research, the incentives are such that the threat of getting called a paternalist by an evil journalist or insane despot is going to be very salient, whereas few will notice or congratulate them if they release the gene drive a week, month, or year faster than normal. I genuinely empathize with those put in this predicament, but I think the right move is still to notice the dynamic explicitly and then ignore those threats in favor of exhaustively addressing the real ones.
You’re making a really big assumption, or at least stating things poorly.
Even if this plan worked and you saved 725,000 people from dying of malaria, in many cases they would still be living in very dangerous conditions. Purely to play devil’s advocate: you may end up saving those 725,000 from dying this year while creating the conditions, with 725,000 additional mouths to feed, that lead to the deaths of 300,000 the following year.
Was that act of forcing a solution on those people so heroic in this case?
Now, this isn’t to say we should not do this (assuming we get buy-in for the plan from those being helped), but rather that we should be careful about where we’re drawing the box boundaries for the analysis before concluding it will be helpful. I get the impression that a degree of myopia is creeping into the analysis here.
One major second-order effect of doing something this dramatic is that you’d expect controls on gene-editing technologies to be tightened considerably (or instituted where none currently exist), and an argument could be made that that would be a good thing.
There’s a tendency to think: if we believe that something should be illegal, we shouldn’t do it ourselves. In competitive arenas, this ends up disadvantaging the most responsible thinkers, either by denying them the fruits of defection without denying those fruits to their competitors, or by suppressing acknowledgement of regulatory holes, since participants are afraid to look hypocritical if they admit the need for regulation while thriving without it. It’s actually not hypocritical to exploit a hole while working to close it. Sometimes, spectacularly exploiting the hole is the only practical way to get it closed.
If that’s the consideration, then, again, sucks for that government I guess. I’d like to cure malaria and am not sure what valuable principles you’d violate by doing so.