I love this idea! A few thoughts:
What could the alien civilizations do? Suppose SETI decoded “Hi from the Andromeda Galaxy! BTW, nanobots might consume your planet in 23 years, so consider fleeing for your lives.” Is there anything humans could do?
The costs might be high. Suppose our message saves an alien civilization one thousand light-years away, but delays a positive singularity by three days. By the time our colonizers reach the alien planet, the opportunity cost would be a three-light-day-deep shell of a thousand-light-year sphere. Most of the volume of a sphere is close to the surface, so this cost is enormous. Giving the aliens an escape ark when we colonize their planet would be quintillions of times less expensive. Of course, a paperclipper would do no such thing.
It may be presumptuous to warn about AI. Perhaps the correct message to send is something like “If you think of a clever experiment to measure dark energy density, don’t do it.”
It depends on your stage of development. You might build a defense, flee at close to the speed of light and take advantage of the universe’s expansion to get into a separate Hubble volume from mankind, accelerate your AI program, or prepare for the possibility of annihilation.
Good point, and the resources we put into signaling could instead be used to research friendly AI.
The warning should be honest and give our best estimates.
Quite.
The outer three light-days of a 1000-light-year sphere account for about 0.0025% of its volume.
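That figure checks out: for a thin shell of depth t on a sphere of radius R, the volume fraction is approximately 3t/R. A minimal Python sketch verifying it (the radius and delay come from the comments above; the variable names are mine):

```python
# Back-of-the-envelope check of the shell-volume figure quoted above.
# Assumes a colonization sphere of radius 1000 light-years and a
# three-light-day delay, per the comments; variable names are illustrative.

R = 1000.0          # sphere radius, in light-years
t = 3.0 / 365.25    # three light-days, converted to light-years

# Exact fraction of the sphere's volume lying in the outer shell of depth t
exact = 1.0 - ((R - t) / R) ** 3

# Thin-shell approximation: 3t / R
approx = 3.0 * t / R

print(f"exact:  {exact:.6%}")   # ~0.002464%
print(f"approx: {approx:.6%}")  # ~0.002464%
```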