Are there more detailed discussions of this risk that you can link to? I’m curious about synthetic biology’s practitioners’ justifications (or rationalizations) for having few safety measures, given how seemingly obvious the risk is to an outsider. In general, it seems like you’re wasting the attention that you’re drawing to this countdown by not pointing people to more information.
As a synthetic biologist, I justify the lack of regulation of my work by saying “I don’t do that kind of synthetic biology.” Saying “synthetic biology” could lead to a super-plague is like saying “computer science” could lead to an uFAI. Sure, but the majority of the work is harmless. The VAST majority. The only real safety measures I can think of that would work at this point would be censorship of disease research.
Do you think that eventually an individual or a small group of people (terrorists, doomsday cultists, etc.) will be able to engineer a very destructive biological weapon using off-the-shelf synthetic biology tools? Or will synthetic biology never develop to that point? Or is it likely that synthetic biology or other technologies will give us defenses against such weapons before they become possible? Do you see effective regulation happening first, or something else?
For a given value of off-the-shelf, I think that’s doable now. Not x-risk destructive, but millions dead. I haven’t looked too closely at it, in the hopes of staying off watch-lists. The materials needed are much easier to obtain than those for other sorts of weapons, so I’m guessing our only defense right now is the high level of education required to do it. Regulating this would be like trying to keep people from using 3D printers to make weapons: your only real option is to forbid all the printers. And you couldn’t just forbid “synthetic biology”. You would basically have to take all of molecular biology off the table to prevent people from being able to build diseases.
Alas, there’s very little info! I’ve been going through some papers today, but I don’t trust their quality.
This is one of the things the FHI would do, if it had a lot more resources.