How sure are we that dark matter isn’t computronium?
1:10^12 odds against the notion, easily. About as likely as the earth being flat.
Dark matter does not interact locally with itself or with visible matter. If it did, it would experience friction (as interstellar gas, dust, and stars do) and settle into a disk when spiral galaxies form their disks. A key observation of dark matter is that spiral galaxies’ rotational velocity behaves as one would expect from an ellipsoidal halo, not a disk.
The fraction of matter that is dark does not change over time, nor does the total mass of objects in the universe. Sky surveys do not find more visible matter further back in time.
The fraction of matter that is dark does not change across space, even across distances that have not been bridgeable since the inflationary period of the Big Bang. All surveys show spherical symmetry.
By the laws of thermodynamics, computation requires work. Low-entropy energy needs to be converted into high-entropy energy, such as heat. We do not see dark matter absorb or emit energy.
I can imagine no situation where something that is a required part of computational processes could ever present itself to us as dark matter, and no mistake in physics thorough enough to allow it.
How did you get this figure? Two one-in-a-million implausibilities?
Quantum computers are close to reversible. Each halo could be a big quantum coherent structure, with e.g. neutrinos as ancillary qubits. The baryonic world might be where the waste information gets dumped. :-)
Before learning that reversible computation only requires work when bits are deleted, I would have treated each of my points as roughly independent, with about 10^1.5, 10^4, 10^4, and 10^2.5 odds against respectively. The last point is now down to 10^1.5.
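As a quick sanity check on how those exponents combine (assuming, as above, that the four points are roughly independent):

```python
# Combine the per-point odds against the computronium hypothesis,
# treating the four points as independent, as stated above.
exponents_before = [1.5, 4, 4, 2.5]  # odds against each point, as powers of ten
exponents_after = [1.5, 4, 4, 1.5]   # point 4 weakened after learning about reversible computing

print(f"combined odds before: 10^{sum(exponents_before):g}")  # 10^12, matching the original figure
print(f"combined odds after:  10^{sum(exponents_after):g}")   # 10^11
```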
Dumping waste information in the baryonic world would be visible.
Not if the rate is low enough and/or astronomically localized enough.
It would be interesting to make a model in which fuzzy dark matter is coupled to neutrinos, in a way that maximizes the rate of quantum information transfer while remaining within empirical bounds.
Contra #1: Imagine you order a huge stack of computers for a massively multiplayer game. Would you expect it to collapse under its own weight, or would you expect the builders to be careful enough that it won’t collapse like passive dust in free fall?
Contra #4: nope. Landauer’s principle implies that reversible computation costs nothing (until you want to read the result, which then costs next to nothing times the size of the result you want to read, irrespective of the size of the computation proper). Present-day computers are obviously very far from this limit, but you can’t assume « computronium » is too.
#2 and #3 sound stronger, imo. Could you give a glimpse of the confidence intervals and how they vary from one survey to the next?
#1 - Caution doesn’t solve problems; it finds solutions if they exist. You can’t use caution to ignore air resistance when building a rocket. (Though collapse is not necessarily expected—there’s plenty of uncollapsed interstellar dust.)
#4 - I didn’t know about Landauer’s principle, though going by what I’m reading, you’re mistaken on its interpretation—it takes ‘next to nothing’ times the part of the computation you throw out, not the part you read out, and the part you throw out grows in proportion to the negentropy you’re getting. No free lunch, still, but one whose price is deferrable to the moment you run out of storage space.
That would make it possible for dark matter to be part of a computation that hasn’t been read out yet, though not necessarily a major part. I’m not sure the reasoning below is correct, but the Landauer limit with the current 2.7 K universe as heat bath is 0.16 meV per bit. This means that the ‘free’ computational cycles you get from only having to pay at the end would, to a maximally efficient builder, be worth an extra 0.16 meV for every piece of matter that can hold one bit. We don’t yet have a lower bound for the neutrino mass, but the upper bound is 120 meV. If the upper bound is true, you would have to cram about 10^3 bits into a neutrino before using it as storage nets you more than burning it for energy (by chucking it into an evaporating black hole).
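Checking those numbers (the 2.7 K heat bath and the 120 meV neutrino-mass bound are the figures quoted above; the rest is just k_B·T·ln 2):

```python
import math

# Landauer limit: minimum energy to erase one bit into a heat bath at temperature T.
k_B = 8.617333e-5   # Boltzmann constant, eV/K
T_cmb = 2.725       # K, current CMB temperature used as the heat bath
landauer_meV = k_B * T_cmb * math.log(2) * 1e3   # eV -> meV

m_nu_meV = 120.0    # upper bound on the neutrino mass quoted above, in meV

print(f"Landauer limit at {T_cmb} K: {landauer_meV:.2f} meV per bit")          # ~0.16 meV
print(f"bits per neutrino to beat burning it: {m_nu_meV / landauer_meV:.0f}")  # ~7e2, i.e. roughly 10^3
```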
I don’t have data for #2 and #3 at hand. It’s the scientific consensus, for what that’s worth.
1-3: You are certainly right that cold and homogeneous dark matter is the scientific consensus right now (at least if by consensus we mean « most experts would either think that’s true or admit there is no data strong enough to convince most experts it’s wrong »).
The point I’m trying to make is: as soon as we say « computronium », we are outside of normal science. In normal science, you don’t suppose matter can choose to deploy itself like a solar sail and use that to progressively reach the outer regions of the galaxy where dangerous supernovae are less frequent. You suppose that if it exists it has no aim, then find the best non-weird model that fits the data.
In other words, I don’t think we can assume that the scientific consensus is automatically worth 10^4 or 10^8 as evidence on « how sure are we that dark matter is not a kind of matter that astrophysicists usually don’t bother to consider? », especially when the scientific consensus also includes « we need to keep spending resources on figuring out what dark matter is ». You do agree that’s also the scientific consensus, right? (And not just to keep labs open, but really to add data and to visit and revisit new and old models, because we’re still not sure what it is.)
4: in the theory of purely reversible computation, the size of what you read out dictates the size you must throw out. Your computation is, however, more sound than the theory of pure reversible computation, because pure reversible computation may well be as impossible as perfectly analog computation. Now, suppose all dark matter emits 0.16 meV/bit. How much computation per second and per kilogram would keep the thermal radiation well below our ability to detect it?
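One way to put rough numbers on that last question; the 0.16 meV/bit figure is from the thread, but the detectability threshold below is only a placeholder assumption, not a measured limit:

```python
import math

# How many bit erasures per second and per kilogram would a given radiated-power
# limit allow, if each erasure dissipates the Landauer energy at the CMB temperature?
k_B = 1.380649e-23                   # Boltzmann constant, J/K
T_cmb = 2.725                        # K
E_bit = k_B * T_cmb * math.log(2)    # ~2.6e-23 J per erased bit (~0.16 meV)

P_limit_W_per_kg = 1e-12             # PLACEHOLDER detectability threshold in W/kg (pure assumption)
erasures_per_s_per_kg = P_limit_W_per_kg / E_bit

print(f"energy per erased bit: {E_bit:.2e} J")
print(f"allowed erasures: {erasures_per_s_per_kg:.2e} per second per kilogram")
```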
Reading the results isn’t the only time you erase bits. Any time you use an “IF” statement, you have to either erase the branch that you don’t care about or double the size of your program in memory.
Any time you use an « IF » statement: 1) you’re not performing a reversible computation (i.e. your tech is not what minimises energy consumption); 2) the minimal cost is one bit, irrespective of the size of your program. Using MWI you could interpret this single bit as representing « half the branches », but not half the size in memory.
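For a concrete sense of how a conditional can run without erasing anything, here is a toy reversible « IF », a Fredkin (controlled-swap) gate. It is a bijection on bit triples and its own inverse, so nothing has to be dumped to the heat bath until you choose to discard a bit:

```python
def fredkin(control: int, a: int, b: int) -> tuple[int, int, int]:
    """Controlled swap: if control is 1, swap a and b; otherwise pass them through.

    The map (control, a, b) -> (control, a', b') is a bijection on bit triples,
    so no information is destroyed: a reversible analogue of an IF branch.
    """
    if control:
        return control, b, a
    return control, a, b

# The gate is its own inverse: applying it twice restores the original inputs.
for triple in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*triple)) == triple
```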
Unless it’s computronium made of non-interacting matter, fairly. It’s not just distant galaxies; there’s plenty of it in the Milky Way too.
Non-interaction is exactly what you want if shielding things from causal interaction lets you slip down the entropy gradient more slowly, or something even more exotic.