The recent talk about alien constructs such as Dyson spheres got me wondering.
Assuming they exist, why do we expect to see Dyson spheres in other star systems? A new Dyson sphere (that is, the star plus Dyson sphere as a system) would not emit much of anything and so would be invisible. Of course, the energy has to go somewhere, and even superadvanced aliens (assuming they haven't developed all-new superadvanced physics) will have a lot of waste heat. That heat, we expect, would be dumped into surrounding space as some sort of radiation, and so we would see it.
That leads me to two stupid questions (note that we are talking at the star-system scale):
Can you dump waste heat directionally? If you built a Dyson sphere and became invisible at interstellar distances, could you radiate your heat signature as a beam and continue to avoid being seen?
If you have a handy black hole around, can you dump the waste heat into it? What would that look like?
P.S. It seems to me that if the Dyson sphere functions only as an energy source, it would be invisible. Imagine a scenario where aliens come to a star system, build a Dyson sphere around the star, and then arrange for all that energy to be narrow-beamed to neighbouring star systems, where it will be collected and used. The point is that the energy is used elsewhere, so the Dysoned star emits very little, and unless you're in the beam you don't see it. Would that work?
Not sure it makes sense thermodynamically to deal with waste heat that way (if you are transmitting “waste heat” in a narrow beam, you are basically just transmitting energy in a narrow beam, and so it’s not waste heat anymore—you can get useful work out of it).
edit: I suppose the question is: what fraction of the star's outgoing energy can we harness in principle such that the waste heat is hard to tell apart from the background, so that we completely hide the fact that the star is there? For example, in the limit, if you used just a little of the star's energy to redirect all the rest into a black hole, would the waste heat from the redirection effort alone be detectable? If so, we can't hide a star; the best we can do is not use too much energy, so the star looks like a normal star with no life in the system (but evil aliens can still come and check it out, since they know a star is there). If not, maybe we can harness some bigger fraction on the way into the black hole. If so, what fraction is physically possible? I don't know.
I feel like physicists already worked out that you can’t hide stars, but I don’t know the literature.
I suppose the question is: what fraction of the star's outgoing energy can we harness in principle such that the waste heat is hard to tell apart from the background
You can never make the temperature of the outgoing radiation indistinguishable from the cosmic background: energy is being generated by the star, so in equilibrium more energy must leave the system than enters from the background.
The CMB sits at ~2.73 K. Let's say you wanted to radiate the entire energy output of a star at 3.73 K, one kelvin warmer. Then the flux out must equal the star's flux plus the CMB flux in. For a star like the Sun, the required surface area works out to a sphere a fifth of a light year (~13,000 AU) in radius to dissipate one solar luminosity. (Scale the radiator area in proportion to the fraction of the solar luminosity you actually want to dissipate, but keep in mind that putting the radiators much closer than a fifth of a light year would probably be pointless, since they'd be heated above 3.73 K by the star itself. Also keep in mind that such an object would look as large as the full Moon from 44 light years away, and as wide as Jupiter appears in our sky from 2,000 light years away.) For ten kelvin, you need a sphere 0.025 light years (~1,560 AU) in radius; for 50 kelvin, 62 AU (twice the radius of Neptune's orbit).
Of course, there’s also the starlight flux of all the other nearby stars, which makes this worse for very low temperatures.
(Calculations done using energy out = energy in from the CMB + solar luminosity, i.e. 4πR²σT⁴ = L_sun + 4πR²σT_CMB⁴, together with the definition of blackbody radiation.)
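For concreteness, here is a minimal script (Python) that reproduces those radii from the balance above; the only physics in it is the blackbody formula already stated:

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26   # solar luminosity, W
T_CMB = 2.725      # CMB temperature, K
AU = 1.496e11      # astronomical unit, m
LY = 9.461e15      # light year, m

def radiator_radius(temp_k, power=L_SUN):
    """Radius of a spherical blackbody shell that dissipates `power` at
    temperature temp_k while re-emitting the CMB flux it absorbs:
    4*pi*R^2 * sigma * (T^4 - T_CMB^4) = power."""
    net_flux = SIGMA * (temp_k**4 - T_CMB**4)  # net W per m^2 of shell
    return math.sqrt(power / (4 * math.pi * net_flux))

for temp in (3.725, 10.0, 50.0):
    r = radiator_radius(temp)
    print(f"T = {temp:6.3f} K -> R = {r / LY:.3f} ly = {r / AU:,.0f} AU")
```

Running it gives ~0.21 ly (~13,000 AU), ~1,560 AU, and ~62 AU for the three temperatures, matching the figures above.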
EDIT: I should go over some astronomy papers and figure out what amounts of material typically produce observable infrared excesses.
What does it mean to hide a star? Would it not be 'visible' by having gravity?
That explains dark matter: vast alien civilisations that leak nothing but gravity. And the microwave background.
No it doesn't. Microwave background intensity is uncorrelated with the imputed dark matter density in a given direction.
If you dumped waste heat directionally, it would act as a photon rocket.
If you somehow reflected all the light of a star in one direction, the momentum change in the light would produce thrust on the reflector (much less than is needed to move a star), and since no substance is 100% reflective, the reflector would nonetheless heat up and, given its vast surface area, be visible in the infrared. The same is true of any material directing waste thermal radiation: you can shunt most of it in some direction, but the material will still reach an equilibrium temperature warmer than the cosmic microwave background.
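Back of the envelope, the thrust of a perfectly collimated beam is F = L/c; a quick sketch of the numbers for a Sun-like star:

```python
C = 2.998e8       # speed of light, m/s
L_SUN = 3.828e26  # solar luminosity, W
M_SUN = 1.989e30  # solar mass, kg
YEAR = 3.156e7    # seconds per year

thrust = L_SUN / C               # N; momentum flux of a fully collimated beam
accel = thrust / M_SUN           # m/s^2 if the whole star must be pushed
dv_per_myr = accel * 1e6 * YEAR  # velocity gained per million years, m/s

print(f"thrust       = {thrust:.2e} N")        # ~1.3e18 N
print(f"acceleration = {accel:.2e} m/s^2")     # ~6.4e-13 m/s^2
print(f"delta-v/Myr  = {dv_per_myr:.0f} m/s")  # ~20 m/s per million years
```

An enormous thrust in absolute terms, but only ~20 m/s of delta-v per million years if it has to move a solar mass.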
Furthermore, all the energy would eventually get absorbed/used SOMEWHERE and become heat. Local conversion of energy to other forms would also always involve waste heat production.
As far as I am aware, black hole surface gravity is analogous to temperature and surface area is analogous to entropy when you start digging into their thermodynamics. Dumping waste thermal radiation in will make them bigger and they will eventually re-radiate thermal radiation via the Hawking mechanism in the far future.
Dumping waste thermal radiation in will make them bigger and they will eventually re-radiate thermal radiation via the Hawking mechanism in the far future.
Black holes have negative specific heat, i.e., dumping energy into them makes them larger and hence colder. In particular, a black hole whose temperature is colder than the microwave background will just keep absorbing energy, and hence get even colder.
Indeed! Their surface gravity (temperature) is inversely proportional to mass and their surface area (entropy) is proportional to the square of mass.
In a non-expanding universe, where the background radiation is not perpetually redshifted to oblivion, whatever emitted the radiation would eventually cool down, and over timescales so large they make the age of our universe look infinitesimal, all the radiation would be absorbed by the holes, cooling the background. The holes would then emit radiation, and holes smaller (hotter) than the average temperature of this new background would evaporate, while those larger (cooler) would grow. I'm not sure whether this always leads to complete evaporation as the average size rises, until you have only one huge hole which must itself evaporate, or whether in a non-expanding universe you could reach an equilibrium, with mass/energy/heat distributed in an unchanging way between low-temperature radiation and huge black holes.
In our expanding universe, not only does the primordial background perpetually redshift toward absolute zero over time, but any radiation exchanged between black holes at cosmological distances also redshifts further the farther apart they are. So in an expanding universe the background eventually falls below the black hole temperature; the holes end up hotter than the background and emit their entire mass back into the universe via Hawking radiation.
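For a sense of scale, a short sketch using the standard Hawking temperature formula T_H = ħc³/(8πGMk_B); the equilibrium-mass figure is just where T_H crosses today's CMB temperature:

```python
import math

HBAR = 1.055e-34  # reduced Planck constant, J s
C = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
KB = 1.381e-23    # Boltzmann constant, J/K
M_SUN = 1.989e30  # solar mass, kg
T_CMB = 2.725     # CMB temperature today, K

def hawking_temperature(mass_kg):
    """T_H = hbar c^3 / (8 pi G M k_B): inversely proportional to mass."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * KB)

print(f"T_H(1 M_sun) = {hawking_temperature(M_SUN):.2e} K")  # ~6.2e-8 K, far below the CMB

# Mass at which a hole is in (unstable) equilibrium with today's CMB:
m_eq = HBAR * C**3 / (8 * math.pi * G * KB * T_CMB)
print(f"equilibrium mass = {m_eq:.2e} kg")  # ~4.5e22 kg, ~0.6 lunar masses
```

So any stellar-mass hole is colder than the background by many orders of magnitude and will keep absorbing for the foreseeable future.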
No it doesn’t. Photons travel until they hit something in straight lines. The microwave background comes uniformly from all directions behind stars and galaxies.
If you have a handy black hole around, can you dump the waste heat into a black hole? What that would look like?
More or less like a black hole: a gravitational signature but no light emitted.
Imagine a scenario where aliens come to a star system, build a Dyson sphere around the star, and then arrange for all that energy to be narrow-beamed to neighbouring star systems where it will be collected and used.
That doesn't work. The emitted energy must have higher entropy content than the energy originally radiated by the star, so it would be less useful to the destination star. In fact, if the destination star system can extract useful work from it, so could the origin system.
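To put a rough number on "higher entropy content": blackbody radiation carries an entropy flux of (4/3)·power/T, so each joule radiated at a lower temperature carries more entropy. A minimal sketch; the 300 K shell temperature is purely an illustrative assumption:

```python
def entropy_per_joule(temp_k):
    """Blackbody radiation carries entropy flux (4/3) * power / T,
    so each joule radiated at temperature T carries (4/3)/T in J/K."""
    return 4.0 / (3.0 * temp_k)

T_STAR = 5772.0  # K, effective temperature of the Sun's photosphere
T_SHELL = 300.0  # K, assumed re-emission temperature (illustrative only)

s_star = entropy_per_joule(T_STAR)
s_shell = entropy_per_joule(T_SHELL)
print(f"starlight at {T_STAR:.0f} K: {s_star:.2e} J/K per joule")
print(f"shell at {T_SHELL:.0f} K:    {s_shell:.2e} J/K per joule "
      f"({s_shell / s_star:.0f}x the entropy)")
```

Under that assumption, each joule leaving the shell carries roughly 19 times the entropy of the original starlight, which is what limits the work extractable downstream.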
so it would be less useful to the destination star.
Sure, but I don’t see a problem here. If you have a particular attachment to a star system, you may want to supply it with energy from other stars instead of moving to them.
The point is that since the energy will get “consumed” elsewhere, the waste heat at the origin will be minimal and so the Dyson sphere will remain relatively cold (though higher than the background temperature, to be sure).
However, the destination system would then emit large amounts of waste heat, i.e., infrared.