That’s the idea I try to address in my comment above. In our universe, if the same pattern appears twice, then presumably those two appearances have a common cause. But if our universe were just the time-reversed version of an entropy-decreasing universe, it doesn’t seem to me like this would need to be the case.
When you run the dynamics in the ordinary time direction in a low-entropy environment, it’s perfectly possible for some pattern to generate two copies of itself, which then disintegrate: X → X X → ? ?
If you then flip the direction of time, you get ? ? → X X → X; that is, X appearing twice independently and then merging.
I don’t think there’s anything about the “bias towards lower entropy” law that would prevent the X → X X → ? ? pattern (though maybe I’m missing something), so if you reverse that, you’d get a lot of ? ? → X X → X patterns.
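To make the reversibility point concrete, here’s a minimal sketch (my own toy example, not anything from the discussion above) using a second-order “Fredkin-style” reversible cellular automaton. The update s[t+1] = F(s[t]) XOR s[t-1] is invertible for any local rule F, so every forward trajectory, including one where a pattern spreads out and dissolves, is equally lawful when read backwards:

```python
# A Fredkin-style second-order reversible cellular automaton (toy example).
# Update rule:  next = F(cur) XOR prev,  which is invertible for ANY F:
#               prev = F(cur) XOR next.
# So a recorded forward history, read in reverse, is also a valid history.

def step(prev, cur):
    """One forward step; F = 'OR of the three-cell neighborhood' (arbitrary choice)."""
    n = len(cur)
    f = [cur[(i - 1) % n] | cur[i] | cur[(i + 1) % n] for i in range(n)]
    return [f[i] ^ prev[i] for i in range(n)]

def run(prev, cur, steps):
    """Record the full trajectory, starting from two initial states."""
    traj = [prev, cur]
    for _ in range(steps):
        prev, cur = cur, step(prev, cur)
        traj.append(cur)
    return traj

n = 16
s0 = [0] * n
s1 = [0] * n
s1[n // 2] = 1                      # a single 'X' pattern
fwd = run(s0, s1, 10)

# Reading the history backwards satisfies the same rule with time reversed:
# step(fwd[t+1], fwd[t]) recovers fwd[t-1] at every interior step.
for t in range(1, len(fwd) - 1):
    assert step(fwd[t + 1], fwd[t]) == fwd[t - 1]

for row in fwd:
    print("".join(".X"[c] for c in row))
```

The assertion checks that the recorded history, traversed in reverse, obeys the inverse dynamics exactly; whenever the forward run contains an X → X X → ? ? episode, the reversed reading contains the corresponding ? ? → X X → X episode.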
Which measure?
Whichever measure you use to quantify entropy.
If the dynamics are not measure-preserving under any measure, then you can simply destroy information to get rid of entropy. E.g., the constant function f(x) = 0 maps every state to the same point, destroying all information and thereby getting rid of all entropy.
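A quick numerical illustration of that point (a sketch of my own in NumPy; the specific distribution and maps are just for demonstration): a permutation of the state space is measure-preserving and leaves Shannon entropy unchanged, while the constant map f(x) = 0 funnels all probability mass onto one state and zeroes the entropy:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(8))        # a random distribution over 8 states

# Measure-preserving: a permutation just relabels the states.
perm = rng.permutation(8)
p_perm = p[perm]

# Not measure-preserving: f(x) = 0 sends every state to state 0.
p_const = np.zeros(8)
p_const[0] = p.sum()                 # all probability mass lands on one state

print(f"H(p)        = {shannon_entropy(p):.3f} bits")
print(f"H(perm(p))  = {shannon_entropy(p_perm):.3f} bits (unchanged)")
print(f"H(const(p)) = {shannon_entropy(p_const):.3f} bits (all entropy gone)")
```

Measure preservation is exactly what blocks this move: a bijection can shuffle states around but can never merge them, so it can’t throw information (and hence entropy) away.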
I don’t see why.