This is a very good metaphor and I approve of you making it.
But it only works when there are a reasonably small number of predecessors.
If there have been a thousand gladiators before you, fine.
If there have been a trillion gladiators before you, someone else has had this idea, and that makes the absence of names on the coliseum ten times scarier.
If there have been a trillion gladiators before you, if the conditions have been in place for gladiators to bribe others to put their names on the wall since the beginning of gladiating, and if there are still no names on the wall, then you are fundamentally misunderstanding some aspect of the situation. Either people are lying when they say there have been a trillion gladiators before you, or people are lying on a much more fundamental level: for example, the walls of the coliseum are wiped clean once a year, or this entire scenario is a dream.
If we assume a late filter and set up this scenario, the obvious question becomes “What happened to the last civilization that tried this?”
And the obvious answer is “nothing good”.
Since it seems unlikely that every past civilization collapsed by coincidence just before it could implement this idea, under strong assumptions like “very many civilizations” and “ease of galaxy-wide transmission” we are left with only two possibilities: an early filter, or careful enforcement of a late filter by an alien intelligence.
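To put a rough number on that “unlikely coincidence” point (the independence assumption and the symbol p below are purely illustrative, not anything from the original argument):

```latex
% Illustrative back-of-the-envelope only.
% Suppose each of N prior civilizations independently collapses "by coincidence"
% just before implementing the idea, with probability p. Then
\Pr[\text{all } N \text{ collapse by coincidence}] = p^{N}
% e.g. even with p = 0.9 and N = 1000, \; 0.9^{1000} \approx 2 \times 10^{-46}
```

Unless p is essentially 1, which is just another way of saying “filter”, the all-coincidence story does not survive even a modest N.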
The reason this scenario requires such nonsensical decision theory is that it is based on a flawed assumption: that this state of affairs plus a late filter could ever come about naturally.
I hope this seems like a logical development of what I said in the post you linked.
Let G = the number of civilizations in our galaxy that have reached our level of development.
I agree with you that for sufficiently large values of G we are left with either “careful enforcement of late filter by alien intelligence” or “flawed assumptions”. For sufficiently low G we don’t have to fear the great filter.
But for a medium-range G (a thousand gladiators) we should be very afraid of it, and I think this is the most likely situation: the higher G is, the more common observers like us are, so long as G isn’t so large as to create absurdities (assuming away the zoo hypothesis and alien exterminators). What’s needed is some kind of mathematical model that captures the tradeoff between the Fermi paradox getting worse and the anthropics making observers such as us more common as G increases.
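Here is one very crude way such a model could look. Every ingredient below is a placeholder of mine rather than a claim: a uniform prior over candidate values of G, a self-indication-style anthropic weight proportional to G, and a per-civilization probability q of leaving a galaxy-visible mark.

```python
# Toy model of the G tradeoff: anthropic weighting favors large G,
# while the silent sky (no visible predecessors) penalizes large G.
# All priors and parameters here are illustrative assumptions, not claims.

def posterior_over_G(G_values, q=1e-3):
    """Posterior over G, the number of civilizations at our level.

    Assumes (purely for illustration):
      - a uniform prior over the candidate G values,
      - self-indication-style anthropic weighting: P(we exist | G) proportional to G,
      - each prior civilization independently leaves a galaxy-visible mark
        with probability q, so P(silent sky | G) = (1 - q) ** G.
    """
    weights = []
    for G in G_values:
        prior = 1.0                      # uniform prior over listed G values
        anthropic = G                    # more civilizations -> more observers like us
        fermi = (1.0 - q) ** G           # probability nobody visible has shown up yet
        weights.append(prior * anthropic * fermi)
    total = sum(weights)
    return [w / total for w in weights]

if __name__ == "__main__":
    candidates = [1, 10, 100, 1_000, 10_000, 100_000]
    post = posterior_over_G(candidates, q=1e-3)
    for G, p in zip(candidates, post):
        print(f"G = {G:>7}: posterior ~ {p:.3f}")
    # With these made-up numbers the posterior peaks at a "medium" G (~1000):
    # the anthropic term grows linearly in G while the Fermi term decays
    # exponentially, which is exactly the tradeoff described above.
```

With any weighting of this shape, the peak sits near G of about 1/q, i.e. at a medium G, which is why the “thousand gladiators” regime looks like the one we should take most seriously under these (entirely assumed) parameters.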