To your first point:

You’re saying it seems more likely that FTL is possible than that every single civilization wipes itself out. Intuitively, I agree, but it’s hard to be sure.
I’d say it’s not that unlikely that P(doom before K2) > .9999. I know more about AI and alignment than I do physics, and it’s looking a lot like AGI is surprisingly easy to build once you’ve got the compute (and less of that than we thought), and like coordination is quite difficult. Long-term stable AGI alignment in a selfish and shortsighted species doesn’t seem impossible, but it might be really hard (and I think it’s likely that any species creating AGI will have barely graduated from being animals, as we have, so that could well be universal). On the other hand, I haven’t kept up on physics, much less debates on how likely FTL is.
I think there’s another, more likely possibility: other solutions to the Fermi paradox. I don’t remember the author, but there’s an astrophysicist arguing that it’s quite possible we’re the first in our galaxy, based on the frequency of sterilizing nova events, particularly nearer the galactic center. There are a bunch of other galaxies 100,000 to 1 million light years away, which isn’t that far on the timescale of the universe’s ~14-billion-year lifespan (even at 1% of light speed, a million light years is a ~100-million-year crossing, under 1% of that). But this interacts with the timelines for forming habitable planets, and with nova and supernova events that may sterilize most planets frequently enough to prevent intelligent life. Whew.
Hooray, LessWrong, for revealing that I don’t understand the Fermi paradox at all!
Let me just mention my preferred solution, even though I can’t make an argument for its likelihood:
Aliens have visited. And they’re still here, keeping an eye on things. Probably not any of the ones they talk about on Ancient Mysteries (although current reports from the US military indicate that they believe they’ve observed vehicles we can’t remotely build, and it’s highly unlikely to be a secret US program, or any other world power’s, so maybe there are some oddly careless aliens buzzing around...)
My proposal is that a civilization that achieves aligned AGI might easily elect to stay dark. No Dyson spheres that can be seen by monkeys, and perhaps more elaborate means of concealing their (largely virtual) civilization. They may fear encountering either a hostile species with its own aligned AGI, or an unaligned AGI. One possible response is to stay hidden, possibly while preparing to fight. It does seem odd that hiding would work, since an unaligned AGI should be expanding its paperclipping projects at near light speed regardless, but there are plenty of possible twists to the logic that I haven’t thought through.
That interacts with your premise that K2 civilizations should be easy to spot. I guess it’s a claim that advanced civilizations don’t hit K2, because they prefer to live in virtual worlds, and have little interest in expanding as fast as possible.
Anyway, I should drag my head out of this fun space and go do something more pragmatically useful. I intend to help our odds of survival, even if we’re ultimately doomed based on this anthropic reasoning.
I guess it’s a claim that advanced civilizations don’t hit K2, because they prefer to live in virtual worlds, and have little interest in expanding as fast as possible.
This would be hard. You would need active regulations against designer babies and/or reproduction.
Because, well, suppose 99.9% of your population wants to veg out in the Land of Infinite Fun. The other 0.1% thinks a good use of its time is popping out as many babies as possible. Maybe they can’t make sure their offspring agree with this (hence the mention of regulations against designer babies, although even then natural selection will be selecting at full power for any genes producing a tendency to do this), but they can brute-force through that by having ten thousand babies each—you’ve presumably got immortality if you’ve gotten to this point, so there’s not a lot stopping them. Heck, they could even flee civilisation to escape the persecution and start their own civilisation which rapidly eclipses the original in population and (if the original’s not making maximum use of resources) power.
Giving up on expansion is an exclusive Filter, not only at the level of civilisations (they all need to do this, because any proportion of expanders will wind up dominating the end-state) but also at the level of individuals (individuals who decide to raise the birth rate of their civilisation can do it unilaterally unless suppressed). Shub-Niggurath always wins by default—it’s possible to subdue her, but you are not going to do it by accident.
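To make the “any proportion of expanders wins” point concrete, here’s a toy model in Python, with parameters invented purely for illustration (0.1% expanders doubling each generation, everyone else exactly replacing themselves). The expander share passes 50% in about ten generations and rounds to 100% shortly after:

```python
# Toy model (made-up parameters): what fraction of the population are
# "expanders" over time, if expanders reproduce above replacement while
# everyone else merely replaces themselves?

def expander_fraction(initial_fraction=0.001, growth_per_generation=2.0,
                      generations=30):
    expanders = initial_fraction
    others = 1.0 - initial_fraction
    history = []
    for _ in range(generations + 1):
        history.append(expanders / (expanders + others))
        expanders *= growth_per_generation  # above-replacement fertility
        # others stay constant: the Land of Infinite Fun is at replacement
    return history

for g, frac in enumerate(expander_fraction()):
    if g % 5 == 0:
        print(f"generation {g:2d}: expanders are {frac:.1%} of the population")
```

Any growth factor above 1 gives the same end-state; the parameters only change how many generations it takes.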
(The obvious examples of this in the human case are the Amish and Quiverfulls. The Amish population grows rapidly because it has high fertility and high retention. The Quiverfulls are not currently self-sustaining because they have such low retention that 12 kids/woman isn’t enough to break even, but that will very predictably yield to technology. Unless these are forcibly suppressed, birth rate collapse is not going to make the human race peter out.)
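(For concreteness, a minimal break-even calculation under the simplifying assumption that each retained child replaces one adult: with fertility $f$ children per woman and retention rate $r$, self-sustainment requires

$$\frac{f \cdot r}{2} \ge 1 \quad\Longrightarrow\quad r \ge \frac{2}{f} = \frac{2}{12} \approx 17\% \text{ at } f = 12,$$

so “not self-sustaining at 12 kids/woman” implies retention below roughly one in six.)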
Anyway, I should drag my head out of this fun space and go do something more pragmatically useful. I intend to help our odds of survival, even if we’re ultimately doomed based on this anthropic reasoning.
Yes! Please do! I’m not at all trying to discourage people from fighting the good fight. It’s just, y’know, I noticed it and so I figured I’d mention it.