Do you think this is a problem? It seems to me that no development is possible without some tail risk (which we obviously want to minimize wherever possible!). Can we come up with a realistic world in which technological progress is used exclusively for peaceful purposes and never causes any negative surprises? Or a world that develops with zero tail risk?
IMO yes, it is a gigantic problem. I agree that there are tradeoffs, and that progress implies some amount of tail risk as a consequence. The thing is, I don't think we are navigating those tradeoffs well. The analogy I like to use is that our technological progress is like giving a machine gun to a child: bad things are bound to happen. To extend the analogy, when/if we mature to the level of Competent Adult or something, that would be the time to start playing with machine guns.