I was thinking about how the existential risks affect each other. For example, a real world war might either destroy so much that high-tech risks become less likely for a while, or spur research that results in a high-tech disaster.
And we may get home build-a-virus kits before AI is developed, even if we aren’t cautious about AI development.