For example, should mankind vigorously pursue research on how to make Ron Fouchier’s alteration of the H5N1 bird flu virus even more dangerous and deadly to humans...
Trivially speaking, I would say “yes”.
More specifically, though, I would of course be very much against developing increasingly dangerous viral biotechnologies. However, I would also be very much in favor of advancing our understanding of biology in general and of viruses in particular. Doing so will enable us to cure many diseases and bioengineer our bodies (or anything else we want to engineer) to highly precise specifications; unfortunately, such scientific understanding will also allow us to create new viruses, if we choose to do so. Similarly, the discovery of fire allowed us to cook our food as well as to set fire to our neighbours. Overall, I think we still came out ahead.
I think there is something wrong with your fire analogy. You cannot, accidentally or purposefully, burn all the people in the world, or even the vast majority of them, by setting fire to them; but with a virus like the one Luke is talking about, you can kill most people.
Yes, both a knife and an atomic bomb can kill 100,000 people. It is just far easier to do it with the atomic bomb. That is why everybody can have a knife, but only a handful of people can “have” an atomic bomb. Imagine what the risks would be if we gave virtually everybody who was interested all the instructions for building a weapon 100 times more dangerous than an atomic bomb (such as a highly contagious, deadly virus).
You cannot, accidentally or purposefully, burn all the people in the world, or even the vast majority of them, by setting fire to them...
Actually, you could, if your world consists of just you and your tribe, and you start a forest fire by accident (or on purpose).
Yes, both a knife and an atomic bomb can kill 100,000 people. It is just far easier to do it with the atomic bomb. That is why everybody can have a knife, but only a handful of people can “have” an atomic bomb.
Once again, I think you are conflating science with technology. I am 100% on board with not giving out atomic bombs for free to anyone who asks for one. However, this does not mean that we should prohibit the study of atomic theory; and, in fact, atomic theory is taught in high school nowadays.
When Luke says, “we should decelerate AI research”, he’s not saying, “let’s make sure people don’t start building AIs in their garages using well-known technologies”. Rather, he’s saying, “we currently have no idea how to build an AI, or whether it’s even possible, or what principles might be involved, but let’s make sure no one figures this out for a long time”. This is similar to saying, “these atomic theory and quantum physics things seem like they might lead to all kinds of fascinating discoveries, but let’s put a lid on them until we can figure out how to make the world safe from nuclear annihilation”. This is a noble sentiment, but, IMO, a misguided one. I am typing these words on a device that’s powered by quantum physics, after all.