Let G = the number of civilizations in our galaxy that have reached our level of development.
I agree with you that for sufficiently large values of G we are left with either “careful enforcement of a late filter by alien intelligence” or “flawed assumptions.” For sufficiently low G we don’t have to fear the great filter.
But for a medium-range G (on the order of 1,000) we should be very afraid of it, and I think this is the most likely situation: the higher G is, so long as G isn’t so large as to create absurdities (if we assume away the zoo hypothesis and alien exterminators), the more common observers like us are. What’s needed is some kind of mathematical model that captures the tradeoff, as G increases, between the Fermi paradox getting worse and anthropics making observers such as us more common.
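A toy sketch of what such a tradeoff model might look like (my own illustration, with made-up parameters, not anything from the discussion above): weight each candidate value of G by an SIA-style anthropic factor proportional to G, times a Fermi factor (1 − p)^G, the probability that none of the G civilizations has yet become visible to us, where p is the per-civilization chance of visible expansion. The product peaks at an intermediate G, roughly 1/p, which is the formal version of "medium G is the most likely situation."

```python
def posterior_over_G(G_values, p_visible=0.001):
    """Toy posterior over candidate values of G.

    Each candidate G gets an unnormalized weight:
      G                      -- SIA-style anthropic factor: galaxies with
                                more observers like us are proportionally
                                more likely to contain us
      (1 - p_visible) ** G   -- Fermi factor: probability that none of the
                                G civilizations became visible, which shrinks
                                as G grows (the paradox "getting worse")
    p_visible is an assumed, illustrative parameter.
    """
    weights = {G: G * (1 - p_visible) ** G for G in G_values}
    total = sum(weights.values())
    return {G: w / total for G, w in weights.items()}

if __name__ == "__main__":
    candidates = [1, 10, 100, 1000, 10000, 100000]
    for G, prob in posterior_over_G(candidates).items():
        print(f"G = {G:>6}: posterior {prob:.4f}")
```

With p_visible = 0.001 the posterior among these candidates peaks at G = 1000: small G is penalized because it makes observers like us rare, large G is penalized because a crowded yet silent galaxy becomes increasingly surprising. The peak location is just 1/p_visible, so the "medium range" conclusion is only as good as the assumed visibility probability.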