My own suspicion is that the bulk of the Great Filter is behind us. We’ve awoken into a fairly old universe. (Young in terms of total lifespan, but old in terms of maximally life-sustaining years.) If intelligent agents evolve easily but die out fast, we should expect to see a young universe.
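As a toy illustration of that last inference (my own sketch, not part of the original argument; the hard-steps framing is Carter's): if the chance of evolution completing n "hard steps" by time t scales like t^n, observers are distributed with density n·t^(n-1) over the habitable window. Few hard steps (easy evolution) put the typical observer early in the window; many hard steps push observers late. The "die out fast" clause just means observers see roughly the universe age at which they arose. All numbers below are made up for illustration:

```python
import random

def observed_age(n_steps, rng=random):
    """Universe age seen by a random observer when the chance of
    completing n_steps hard evolutionary steps by time t scales
    like t**n_steps (habitable window normalized to [0, 1])."""
    # Inverse-CDF sampling: P(arisen by t) = t**n  =>  t = u**(1/n).
    return rng.random() ** (1.0 / n_steps)

if __name__ == "__main__":
    random.seed(0)
    trials = 100_000
    for n in (1, 2, 5, 10):
        mean = sum(observed_age(n) for _ in range(trials)) / trials
        print(f"hard steps = {n:2d} -> mean observed age ~ {mean:.2f}")
    # Analytically the mean is n / (n + 1): easy evolution (n = 1)
    # gives 0.5, while many hard steps push observers toward the end
    # of the window. On this toy model, waking up in an old universe
    # is evidence for a large filter already behind us.
```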
We can also consider the possibility of stronger anthropic effects. Suppose intelligent species always succeed in building AGIs that propagate outward at approximately the speed of light, converting all life-sustaining energy into objects or agents outside our anthropic reference class. Then any particular intelligent species Z will observe a Fermi paradox no matter how common or rare intelligent species are, because if any other high-technology species had arisen first in Z’s past light cone it would have prevented the existence of anything Z-like. (However, the smaller the Past Filter, the younger the universe that species in this scenario should expect to observe.)
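A quick Monte Carlo makes the selection effect concrete (again my own sketch, with one spatial dimension, arbitrary made-up parameters, and expansion at exactly c): candidate civilizations arise at random spacetime points, and a candidate is precluded if it lies inside the forward light cone of any earlier arrival. Every survivor sees an empty past light cone by construction, so every survivor observes a Fermi paradox:

```python
import random

C = 1.0        # expansion speed (normalized to c)
SPACE = 100.0  # size of the 1-D toy universe (arbitrary units)
T_MAX = 50.0   # length of the era considered (arbitrary units)

def survivors(n_candidates, rng=random):
    """Candidate civilizations arise at random spacetime points; each
    expands at speed C forever. A candidate is precluded if it lies
    inside the forward light cone of any earlier arrival."""
    events = sorted((rng.uniform(0, T_MAX), rng.uniform(0, SPACE))
                    for _ in range(n_candidates))
    alive = []
    for t, x in events:
        if all(abs(x - xs) > C * (t - ts) for ts, xs in alive):
            alive.append((t, x))
    return alive

if __name__ == "__main__":
    random.seed(0)
    for n in (5, 50, 500):
        born = [t for t, _ in survivors(n)]
        print(f"candidates = {n:3d}: {len(born):3d} survive, "
              f"mean birth time ~ {sum(born) / len(born):.1f}")
    # Raising the candidate rate never lets a survivor see neighbors;
    # it only pushes the surviving observers earlier in time.
```

The last comment is the parenthetical point above: a smaller Past Filter (more candidates arising) doesn't change what survivors see in the sky, it only means they find themselves in younger universes.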
So grey goo creates an actual Future Filter by killing its creators, but hyper-efficient hungry AGI creates an anthropic illusion of a Future Filter by devouring everything in its observable universe except the creator species. (And possibly devouring the creator species too; that’s unclear. Evolved alien values are less likely to eat the universe than artificial unFriendly-relative-to-alien-values values are, but perhaps not dramatically less likely; and unFriendly-relative-to-creator AI is almost certainly more common than Friendly-relative-to-creator AI.)
Once everything had been transformed into resident von Neumann machines, evolution among those copies would probably occur at some point, until eventually there might be new macroorganisms organized from self-replicating building blocks, which might again show significant agency and turn their gaze toward the stars.
Probably won’t happen before the heat death of the universe. The scariest thing about nanodevices is that they don’t evolve. A universe ruled by nanodevices is plausibly even worse (relative to human values) than one ruled by uFAI like Clippy, because it’s vastly less interesting.
(Not because paperclips are better than nanites, but because there’s at least one sophisticated mind to be found.)