As far as I am aware, black hole surface gravity is analogous to temperature and surface area is analogous to entropy once you dig into their thermodynamics. Dumping waste thermal radiation in will make them bigger, and in the far future they will eventually re-radiate it as thermal radiation via the Hawking mechanism.
Black holes have negative specific heat, i.e., dumping energy into them makes them larger, hence colder. In particular, a black hole whose temperature is colder than the microwave background will just keep absorbing energy, and hence get even colder.
Indeed! Their surface gravity (temperature) is inversely proportional to mass and their surface area (entropy) is proportional to the square of mass.
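For reference, these are the standard Schwarzschild-hole formulas behind both claims (the negative heat capacity mentioned above follows directly from $T_H \propto 1/M$):

$$
T_H = \frac{\hbar c^3}{8\pi G M k_B} \propto \frac{1}{M},
\qquad
S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{4\pi G k_B M^2}{\hbar c} \propto M^2,
\qquad
C = \frac{d(Mc^2)}{dT_H} = -\frac{\hbar c^5}{8\pi G k_B T_H^2} < 0.
$$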
In a non-expanding universe, where the background radiation is not being perpetually redshifted to oblivion, things play out over timescales so large they make the age of our universe look infinitesimal: whatever emitted the radiation would cool down, and eventually all the radiation would be absorbed by the holes, cooling the background. The holes would then emit radiation of their own, and holes smaller (hotter) than the average temperature of this new background would evaporate while larger (cooler) ones would grow. I'm not sure whether this always ends in complete evaporation, with the average size rising until only one huge hole remains which then has to evaporate, or whether in a universe of fixed size you could reach an unchanging equilibrium with mass/energy/heat distributed between low-temperature radiation and huge black holes.
In our expanding universe, not only does the primordial background perpetually redshift toward absolute zero over time, but any radiation emitted by black holes at cosmological distances from one another is also redshifted to oblivion. So in an expanding universe the background eventually falls below the black hole temperature, the holes end up hotter than their surroundings, and they emit their entire mass back into the universe as Hawking radiation.
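As a rough back-of-the-envelope sketch (my own numbers and helper functions, not from the comments above), here is that comparison in Python, using the standard Hawking temperature and the usual order-of-magnitude evaporation-time estimate:

```python
# Sketch: compare a black hole's Hawking temperature with today's CMB and
# estimate its evaporation time. Constants in SI units.
import math

hbar  = 1.054571817e-34   # J s
c     = 2.99792458e8      # m/s
G     = 6.67430e-11       # m^3 kg^-1 s^-2
k_B   = 1.380649e-23      # J/K
M_sun = 1.989e30          # kg
T_CMB = 2.725             # K today; keeps falling as space expands

def hawking_temperature(M):
    """T_H = hbar c^3 / (8 pi G M k_B): colder for heavier holes."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def evaporation_time(M):
    """Order-of-magnitude lifetime ~ 5120 pi G^2 M^3 / (hbar c^4), in seconds."""
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

for M in (M_sun, 1e12):  # a solar-mass hole and a ~1e12 kg primordial one
    T = hawking_temperature(M)
    print(f"M = {M:.3e} kg: T_H = {T:.3e} K "
          f"({'hotter' if T > T_CMB else 'colder'} than today's CMB), "
          f"lifetime ~ {evaporation_time(M):.3e} s")
```

A solar-mass hole comes out around 6e-8 K, far colder than today's ~2.7 K background, so it only starts net-evaporating once expansion has cooled the CMB below that, and even then its lifetime is of order 1e67 years.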