Robin’s use of the Great Filter argument relies on the SIA (Self-Indication Assumption), which (if one buys it) allows one to rule out a priori the possibility that the development of beings like us is very rare. Absent that, if one’s prior for the development of life is flatter than one’s prior for things like nuclear war (it would be much less surprising for fewer than 1 in 10^100 planets to evolve intelligent life than for fewer than 1 in 10^100 civilizations like ours to avoid self-destruction with advanced technology), then one gets much less of an update in favor of future filters.
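To see the mechanism, here is a minimal sketch of that update under two different priors. All the specifics (log-uniform priors on each filter step, ~10^11 candidate planets, “strong future filter” meaning fewer than 1 in 1000 civilizations at our stage ever become visible) are illustrative assumptions of mine, not anything from Robin’s argument, and the toy model conditions only on the observed silence, setting aside the anthropic update from our own existence that SIA and its rivals disagree about:

```python
import numpy as np

# Toy two-step filter model (all numbers illustrative). A planet yields a
# visible, expanding civilization with probability p_early * p_late, where
# p_early is the chance of evolving intelligent life and p_late the chance
# that a civilization like ours goes on to become visible. The Fermi
# observation is modeled crudely as: none of ~1e11 planets is visible.
# NOTE: this conditions only on the silence and ignores the anthropic
# update from our own existence, which is exactly what SIA disputes.

rng = np.random.default_rng(0)
N_SAMPLES, N_PLANETS = 1_000_000, 1e11

def p_future_filter(log10_early_floor, log10_late_floor=-6):
    # Log-uniform priors on both filter steps, down to the given floors.
    p_early = 10 ** rng.uniform(log10_early_floor, 0, N_SAMPLES)
    p_late = 10 ** rng.uniform(log10_late_floor, 0, N_SAMPLES)
    # Likelihood of total silence (Poisson approximation).
    weights = np.exp(-N_PLANETS * p_early * p_late)
    # Posterior probability of a strong future filter: fewer than
    # 1 in 1000 civilizations at our stage ever become visible.
    return np.average(p_late < 1e-3, weights=weights)

# Prior allowing an astronomically strong early filter (down to 1e-100):
# the silence barely moves the probability of a future filter off its
# prior value of 0.5.
print(p_future_filter(-100))

# Prior capping the early filter at 1 in 1000: the silence forces the
# filter into the future, and the probability jumps toward 1.
print(p_future_filter(-3))
```

With the wide prior the early filter can absorb all the required filtering, so the posterior barely moves; with the capped prior the silence has nowhere else to go and lands on future filters.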
OTOH, the SIA also strongly supports the possibility that we’re in a simulation (if we assign a 1 in 1 million probability to sims being billions of times more numerous, then we should assign more credence to being simulated than to being in the basement), which warps the Great Filter argument into something almost unrecognizable. See this paper for a discussion of the interactions with SIA.
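For the arithmetic behind that parenthetical: SIA weights each possible world by how many observers it contains, so the 1-in-a-million prior gets multiplied by the billion-fold observer ratio. A quick sketch with those numbers (the odds framing is mine):

```python
# SIA arithmetic for the simulation point, using the numbers above
# (1-in-a-million prior, sims a billion times more numerous).
prior_many_sims = 1e-6   # prior probability of the world with ~1e9x more (simulated) observers
observer_ratio = 1e9     # how many times more observers that world contains

prior_odds = prior_many_sims / (1 - prior_many_sims)
posterior_odds = prior_odds * observer_ratio           # SIA: weight each world by its observer count
posterior_prob = posterior_odds / (1 + posterior_odds)

print(posterior_odds)   # ~1000: thousand-to-one odds in favor of the many-sims world
print(posterior_prob)   # ~0.999; nearly all observers in that world are sims
```

So a 1-in-a-million prior becomes roughly 99.9% credence in the many-sims world, and since nearly everyone in that world is simulated, roughly that much credence that we are too.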