What I had in mind were the two largest traps: societies which maintained breathing space being overrun by societies which ruthlessly optimized to overrun other societies, and our entire planet being overrun by more efficient extraterrestrial intelligences which ruthlessly optimized for ability to expand through the universe.
I agree that for more mundane cases like dangerous consumer products and political parties, there’ll probably be some “fences on the various slopes”. But they will be cold comfort indeed if we get wiped out by Malthusian limit-embracing aliens in a century’s time!
I take your point.
But it occurs to me that ruthlessly efficient societies need to be highly coordinated societies, which may push in other directions; I wonder if there’s something worth digging into there...
Another hopeful thought: we might escape being eaten for an unexpectedly long time because evolution is stupid. It might consistently program organic life to optimize for proxies of reproductive success, like social status, long life, or ready access to food, rather than for the ability to tile the universe with copies of itself.
This in no way implies humanity is safe forever; evolution would almost surely blunder into creating a copy-maximizing species eventually, by sheer random accident if nothing else. But humanity's window of safety might be millions or billions or trillions of years rather than millennia.