I am not a physicist, so I didn’t and couldn’t do the calculations, but I don’t really believe that classic probes can reach .999c. They would be pulverised by intergalactic material. Even worse, literal .999c would not be fast enough for this fancy “hits us before we know it” filter idea to work. As I explained in some of the above-quoted threads, my bet would definitely be on the things you called “spaceships out of light”. A sufficiently advanced civilisation might switch from atoms to photons as their substrate. The only resource they would extract from the volume of space they consume would be negentropy, so they wouldn’t need any slowing down or seeds. Again, I am not a physicist. I discussed this with some physicists, and they were sceptical, but their objections seemed to be of the engineering kind, not the theoretical kind, and I’m not sure they sufficiently internalised “don’t bet against the engineering skill of a superintelligence”.
For me, one source of inspiration for this light-speed expansion idea was Stanisław Lem’s “His Master’s Voice”, where precisely tuned radio waves are used to catalyse the formation of DNA-based life on distant planets. (Obviously that’s way too slow for the purposes we discuss here.)
Photons can’t interact with each other (by the linearity of Maxwell’s equations) and so can’t form a computational substrate on their own. This doesn’t rule out “no atoms” computing in general though.
EDIT: I’m wrong. When you do the calculations in full quantum field theory there is an (extremely) slight interaction (due to the creation and destruction of electron–positron pairs, which in some sense destroys the linearity). I don’t know if this is enough to support computers.
They would be pulverised by intergalactic material.
That’s actually concerning. Maybe it isn’t possible to shoot matter intact across the galaxy… I’d have to do the calculations using the interstellar particle density.
Also, surely you mean “interstellar”? I was only thinking of interstellar travel for now; assuming intergalactic is impossible or whatever.
Even worse, literal .999c would not be fast enough for this fancy “hits us before we know it” filter idea to work.
Not for intergalactic, but the galaxy is 100k lightyears across. 0.999c would get you a lag behind the light of 100 years, which is on the same order of magnitude as the time between detectability and singularity (looks like < 200 years for us).
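That lag figure is easy to sanity-check with a few lines (a quick sketch, assuming the ~100,000 light-year diameter quoted above):

```python
# Back-of-the-envelope check: how far behind its own launch light
# does a 0.999c probe arrive after crossing the galaxy?
D_ly = 100_000        # distance in light years (~Milky Way diameter)
v = 0.999             # probe speed as a fraction of c

light_time = D_ly          # light covers D_ly light years in D_ly years
probe_time = D_ly / v      # the probe takes slightly longer
lag_years = probe_time - light_time

print(f"lag ≈ {lag_years:.0f} years")  # ≈ 100 years, as claimed
```

So a 0.1% speed deficit over a 100,000-year trip costs roughly 0.1% of the travel time, i.e. about a century.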
A sufficiently advanced civilisation might switch from atoms to photons as their substrate. The only resource they would extract from the volume of space they consume would be negentropy, so they wouldn’t need any slowing down or seeds.
How would one eat a star without slowing down, even in principle?
precisely tuned radio waves are used to catalyse the formation of DNA-based life on distant planets. (Obviously that’s way too slow for the purposes we discuss here.)
This is closer to what I was thinking, but of course if you can catalyze DNA, you can catalyze arbitrary nanomachines. Exactly how this would work is a mystery to me… (also, doing it with radio waves is needlessly difficult; surely you’d use something precise and ionizing like UV, X-rays, or gamma rays)
Also, surely you mean “interstellar”? I was only thinking of interstellar travel for now; assuming intergalactic is impossible or whatever.
When you look at it from a Fermi paradox perspective, you have to be able to account for many hundreds of millions of years of expansion, because there can be many civilisations that are that much older than us. We are talking about some crazy thing that is supposed to be able to consume a galaxy at almost-optimal speed. I don’t expect galaxy boundaries to stop it completely, neither by intention nor by necessity. I am not even sure that it has to treat intergalactic space as the long boring travel between the rare interesting parts. Maybe all it really needs is empty space.
0.999c would get you a lag behind the light of 100 years, which is on the same order of magnitude as the time between detectability and singularity (looks like < 200 years for us).
Interesting point.
How would one eat a star without slowing down, even in principle?
Note that I speculated about photons as a substrate. Maybe major reorganisation of atoms is unnecessary, and it can just fill the space around the star, and utilise the star as a photon source.