If there is a future Great Filter, it seems likely it would be one of two things:
1) a science experiment that destroys the world even though there was no reason to think that it would.
2) something analogous to nuclear weapons, except easily constructible by an individual using easily obtainable materials, so that as soon as the knowledge spreads, any random person can inflict immense destruction.
Are there any strategies that would guard against these possibilities?
1: No. Well, in theory, a presence on the moons of Neptune that could survive indefinitely without contact would do it, but that’s not going to happen any time soon.
2: Arguably, we already live in this world. There are very destructive things in the canon of human knowledge, only people don’t conceptualize them as weapons at all, but merely as dangers to be avoided. So, good news: this does not work as a filter, and the genuinely odd thing is that we do* think of runaway supercriticality as a weapon. Perhaps conditioning by lots of wars to think of explosions as ways to kill people?
*I’m not going to name examples in this context, because that might theoretically “help” someone to think of said example as a weapon. Which would be bad.