Why? This looks as if you’re taking a hammer to Ockham’s razor.
In the strictest sense, yes, I am. I design, build, and test social models for a living (so this may simply be a case of me holding Maslow’s Hammer). The universe exhibits a number of physical properties which resemble modeling assumptions. For example, speed is absolutely bounded at c. If I were designing an actual universe (not a model), I wouldn’t enforce upper bounds; what purpose would they serve? If I were designing a model, however, boundaries of this sort would be critical to reducing the complexity of the model universe to the realm of tractable computability.
On any given day, I’ll instantiate thousands of models. Having many models running in parallel is useful! We observe one universe, but if there’s a non-zero probability that the universe is a model of something else (a possibility which Ockham’s Razor certainly doesn’t refute), then the fact that I generate so many models suggests that a super-universal process or entity may be doing the same thing, with our universe as one instance.
I do think it’s useful to use what we know about simulations to inform whether or not we live in one. As I said in my other comment, I don’t think a finite speed of light, etc., says much either way, but I do want to note a few things that I think would be suggestive.
If time were discrete and the time step appeared to be a function of known time step limits (e.g., the Courant-Friedrichs-Lewy (CFL) condition), I would consider that to be good evidence in favor of the simulation hypothesis.
The jury is still out on whether time is discrete, so we can’t evaluate the second necessary condition. If time were discrete, this would be interesting and could be evidence for the simulation hypothesis, but it’d be pretty weak. You’d need something further that indicates something about the algorithm, like the time step limit, to draw a stronger conclusion.
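To make the shape of that dependence concrete, here is a minimal sketch (Python, with made-up numbers; the constant and values are purely illustrative) of how a CFL-style limit ties the time step to the grid spacing and the fastest signal speed. Finding that the universe’s time step, if it has one, followed a rule of this shape is the kind of thing I’d count as suggestive.

    # Minimal sketch of a CFL-style time step limit; all values are hypothetical.
    def cfl_time_step(dx, max_signal_speed, courant=0.5):
        # Largest stable explicit step: no signal crosses more than `courant` cells per step.
        return courant * dx / max_signal_speed

    dx = 1.0e-3         # grid spacing, m (made up)
    max_speed = 340.0   # fastest wave speed in the domain, m/s (made up)
    dt = cfl_time_step(dx, max_speed)  # the solver would never take a step larger than this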
Another possibility would be some conservation principle violated in a way that reduces computational complexity. In the water sprinkler simulations I’ve run, droplets are removed from the simulation when their size drops below a certain (arbitrary) limit, as these droplets have little impact on the physics and mostly serve to slow down the computation. Strictly speaking, this violates conservation of mass. I haven’t seen anything like this in physics, but its existence could be evidence for the simulation hypothesis.
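Roughly what that culling looks like in code (a toy sketch with made-up names and thresholds, not my actual sprinkler code):

    # Toy sketch: remove sub-threshold droplets to save computation.
    # Strictly speaking, the mass of the removed droplets simply vanishes.
    MIN_RADIUS = 1.0e-6  # arbitrary cutoff; below this, a droplet barely affects the physics

    def cull_droplets(droplets, min_radius=MIN_RADIUS):
        # Keep only droplets large enough to matter; the rest are deleted outright.
        return [d for d in droplets if d.radius >= min_radius]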
This is not true in general. I’ve considered a similar idea before, but as a reason to believe we don’t live in a simulation (not that I think this is a very convincing argument). I work in computational fluid dynamics. “Low-Mach”/incompressible fluid simulations, where the speed of sound is assumed infinite, are much more tractable than the same situation run on a “high-Mach” code, even if the actual fluid speeds are very subsonic. The difference in running time is at least an order of magnitude.
To be fair, it can go either way. The speed of the fluid is not “absolutely bounded” in these simulations. These simulations are not relativistic, and treating them relativistically would make things more complicated. The speed of acoustic waves, however, is treated as infinite in the low-Mach limit. I imagine there are situations in other branches of mathematical physics where treating a speed as infinite (as in the case of acoustic waves) or zero (as in the non-relativistic case) simplifies certain situations. In the end, it seems like a wash to me, and this offers little evidence for or against the simulation hypothesis.
Huh. It never occurred to me that imposing finite bounds might increase the complexity of a simulation, but I can see how that could be true for physical models. Is the assumption you’re making in the low-Mach/incompressible fluid models that the speed of sound is explicitly infinite, or is it that the speed of sound lacks an upper bound? (i.e., is there a point in the code where you have to declare something like “sound.speed = infinity”?)
Anyway, I’ve certainly never encountered any such situation in models of social systems. I’ll keep an eye out for it now. Thanks for sharing!
As a trivial point, imposing finite bounds means that you can’t use the normal distribution, for example :-)
Not true: it means you shouldn’t use a normal distribution, and when you do, you should say so up front. I see no reason not to apply normal distributions if your limit is high (say, greater than 4 sigmas; social science is much fuzzier than physical science). Better yet, make your limit a function of the number of observations you have. As the probability of getting into the long tail gets higher, make the tail longer.
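As a rough sketch of what I mean by the last two sentences (my own illustration, not a standard recipe): the largest value you expect among n standard-normal draws grows roughly like sqrt(2 ln n), so you could put the truncation point there and draw values by simple rejection.

    import math
    import random

    def truncation_limit(n_obs):
        # Expected extreme of n standard-normal draws grows roughly like sqrt(2 ln n).
        return math.sqrt(2.0 * math.log(max(n_obs, 2)))

    def sample_bounded_normal(n_obs, mu=0.0, sigma=1.0):
        # Rejection-sample a normal value, discarding anything beyond the n-dependent limit.
        k = truncation_limit(n_obs)
        while True:
            z = random.gauss(0.0, 1.0)
            if abs(z) <= k:
                return mu + sigma * z

With more observations the cutoff moves outward, so the tail gets longer exactly when landing in it becomes more likely.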
Truncated normal is not the same thing as a plain-vanilla normal. And using it does mean increasing the complexity of the simulation.
Sentence 1: True, fair point. Sentence 2: This isn’t obvious to me. Selecting random values from a truncated normal distribution is (slightly) more complex than, say, a uniform distribution over the same range, but it is demonstrably (slightly) less complex than selecting random values from an unbounded normal distribution. Without finite boundaries, you’d need infinite precision arithmetic just to draw a value.
The problem is not with value selection; the problem is with model manipulation. The normal distribution is very well studied, it has a number of appealing properties which make working with it rather convenient, there is a lot of code written to work with it, and so on. Replace it with a truncated normal and suddenly a lot of things break.
Oh! I see what you’re saying. Definitely can’t argue with that.
Glad you found my post interesting. I found yours interesting as well, as I thought I was the only one who made any argument along those lines.
There’s no explicit step where you say the speed of sound is infinite. That’s just the net effect of how you model the pressure field. In reality, the pressure comes from thermodynamics at some level. In the low-Mach/incompressible model, the pressure only exists to enforce mass conservation, and in some sense is “junk” (though it still compares favorably against exact solutions). Basically, you do some math to decouple the thermodynamic and “fluctuating” pressures (this is really the only change; the rest are implications of it). You end up with a Poisson equation for the “fluctuating” pressure, and this equation has no way to account for finite pressure/acoustic wave speeds. The wave speed is effectively infinite.
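For anyone who wants to see where the infinite wave speed hides, here is the equation I have in mind, written from memory for the simplest constant-density case (so treat it as a sketch rather than a derivation). Taking the divergence of the momentum equation and using the fact that the velocity field is divergence-free leaves an elliptic equation for the pressure:

    \nabla^2 p = -\rho\, \nabla \cdot \left[ (u \cdot \nabla) u \right]

There is no time derivative in this equation, so the pressure field adjusts everywhere at once; that is the sense in which the acoustic wave speed is effectively infinite.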
To be honest, I need to read papers like this to gain a fuller appreciation of all the implications of this approximation. But what I describe is accurate, if lacking in some of the details.
In some ways, this does make things more complicated (pressure boundary conditions being one area). But in terms of speed, it’s a huge benefit.
Here’s another example from my field: thermal radiation modeling. If you use ray tracing (as in 3D rendering), then it’s often practical to assume that the speed of light is infinite, because it effectively is, relative to the other processes you are looking at. The “speed” of heat conduction, for example, is much slower. If you used a finite wave speed for the rays, things would be much slower.
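A toy illustration of that implicit assumption (hypothetical names; nothing from a real radiation code): the ray is traced only to find what it hits and how much energy it deposits, and no time-of-flight appears anywhere, which is exactly the unstated “speed of light is infinite” choice.

    # Toy radiation ray: energy arrives the instant it is emitted.
    # Nothing tracks propagation time, so the speed of light is implicitly infinite.
    def trace_ray(origin, direction, surfaces, power):
        hit = first_intersection(origin, direction, surfaces)  # hypothetical geometry helper
        if hit is not None:
            hit.absorbed_power += power  # deposited "now", with no travel delay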
That makes a lot of sense. I asked about explicit declaration versus implicit assumption because assumptions of this sort do exist in social models. They’re just treated as unmodeled characteristics either of agents or of reality. We can make these assumptions because they either don’t inform the phenomenon we’re investigating (e.g., infinite ammunition can be implicitly assumed in an agent-based model of battlefield medic behavior because we’re not interested in the draw-down or conclusion of the battle in the absence of a decisive victory) or the model’s purpose is to investigate relationships within a plausible range (which sounds like your use case). That said, I’m very curious about the existence of models for which explicitly setting a boundary of infinity can reduce computational complexity. It seems like such a thing is either provably possible or (more likely) provably impossible. Know of anything like that?
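To illustrate the implicit case with the ammunition example above (a toy sketch, not a model I’ve actually fielded): the shooter agent simply never tracks ammunition, so the “infinity” is an omission rather than a declaration anywhere in the code.

    # Toy agent: ammunition is implicitly infinite because it is never modeled at all.
    class Soldier:  # hypothetical agent in the battlefield-medic scenario
        def fire_at(self, target):
            target.take_hit()
            # No ammo counter to check or decrement; the bound just isn't part of the model.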
I see your distinction now. That is a good classification.
To go back to the low-Mach/incompressible flow model, I have seen series expansions in terms of the Mach number applied to (subsets of) the fluid flow equations, and the low-Mach approximation is found by setting the Mach number to zero. (Ma = v / c, so if c, the speed of sound, approaches infinity, then Ma goes to 0.) So it seems that you can go the other direction to derive equations starting with the goal of modeling a low-Mach flow, but that’s not typically what I see. There’s no “Mach number dial” in the original equations, so you basically have to modify the equations in some way to see what changes as the Mach number goes to zero.
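For concreteness, one common form of that expansion (quoted from memory, and conventions vary between authors) writes the pressure as a power series in the Mach number:

    p(x, t) = p_0(t) + \mathrm{Ma}\, p_1(x, t) + \mathrm{Ma}^2\, p_2(x, t) + \cdots

Here p_0 is the spatially uniform thermodynamic pressure, and as Ma goes to zero the order-Ma^2 term is what survives as the “fluctuating” pressure that enforces mass conservation.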
For this entire class of problems, even if there were a “Mach number dial”, you wouldn’t recover the nice mathematical features you want for speed just by setting the Mach number to zero in a code that can handle high-Mach physics. So, for fluid flow simulations, I don’t think an explicit declaration of infinite sound speed can reduce computational time.
From the perspective of someone in a fluid-flow simulation (if such a thing is possible), however, I don’t think the explicit-implicit classification matters. For all someone inside the simulation knows, the model (their “reality”) explicitly uses an infinite acoustic wave speed. This person might falsely conclude that they don’t live in a simulation because their speed of sound appears to be infinite.
Btrettel’s example of ray tracing in thermal radiation is such a model. Another example from social science: basic economic and game theory often assume the agents are omniscient or nearly omniscient.
False: assuming something is infinite (unbounded) is not the same as coercing it to a representation of infinity. Neither of those examples, when represented in code, would require a declaration that thing=infinity. That aside, game theory often assumes players have unbounded computational resources and a perfect understanding of the game, but never omniscience.
A better term is “logical omniscience”.