Well, it might be that such observers are less ‘dense’ than ones in a stable universe.
In that case most of your measure is in stable universes and dust theory isn’t anything to worry about.
But that can’t be the case: isn’t the whole point of dust theory that basically any set of relations can be construed as a computation implementing your subjective experience, and that this experience is self-justifying? If so, the majority of your measure must be dust.
Dust theory has a weird pulled-up-by-your-own-bootstraps taste to it, and I have a strong aversion to regarding it as true. Egan’s argument against it is the best I can find; it’s not entirely satisfying, but it should be sufficiently comforting to let you sleep.
There are different ways of defining ‘measure’. DT guarantees that lack of continuity, and therefore low density, won’t be subjectively noticeable... at least, it will look like chaotic observations, not feel like “I’m dead”.
Maybe you could include:
construed as a computation BY WHOM?
Computation is a process, and not just any process, so the idea of an instantaneous computational state is dubious.
(There is a possible false dichotomy there: consciousness isn’t the output of a computation that takes a lifetime to perform, but there could still be millions of computations required to generate a “specious present”.)
Not necessarily to you. It doesn’t have to make much sense to you at all. But our observations are orderly, and that is something that can’t be explained by the majority of our measure being dust. Why would it default to this?
If you make Egan’s assumption, I think it is an extremely strong argument.
Why don’t you buy it?
I don’t reject it; I simply think that Dust Theory based on this assumption is so unlikely that we may as well assume the opposite: that different patterns can be more common, i.e. have more measure, than others.
I’m confused. What were you referring to when you said, “on this assumption”?
That you find yourself randomly selected from a pool of all conceivable observers, rather than a pool with probabilities assigned to them.
EDIT: Actually, the former option is flatly impossible, because my mindstate would jump to any conceivable one that could be generated from it. I would have an infinitesimal chance of becoming coherent enough to have anything resembling a ‘thought.’
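The “infinitesimal chance” point can be made quantitative with a toy model (my own construction, with made-up numbers, not anything from the thread): if each successive mindstate is drawn uniformly from all conceivable successors, and only a small fraction of those count as coherent continuations, the probability of sustaining even a short run of coherent moments decays geometrically.

```python
def p_coherent_run(n_states: int, n_coherent: int, steps: int) -> float:
    """Probability that `steps` independent uniform draws from
    `n_states` conceivable successor states all land inside the
    `n_coherent` states that count as coherent continuations."""
    return (n_coherent / n_states) ** steps

# Even with absurdly generous odds per step, a five-moment
# "thought" is essentially impossible under uniform selection:
print(p_coherent_run(n_states=10**6, n_coherent=10**3, steps=5))
# roughly 1e-15
```

The specific counts are arbitrary; the point is only that uniform selection over all conceivable states makes coherence astronomically unlikely.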
Then why would you begin to suspect that the pool of observers does not coincide with the set of minds that have a physical instantiation and dynamics? If there’s a nontrivial probability distribution, then there’s going to be SOME sort of rules involved, and physics gives us a really solid candidate for what those rules might be.
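The contrast being drawn here can be sketched as a hypothetical illustration (the weights are invented for the example): a pool of observers with a nontrivial probability distribution behaves very differently from a uniform pool, and whatever sets the weights plays the role the comment assigns to physics.

```python
import random

random.seed(0)  # for reproducibility

# Hypothetical weights: suppose physically instantiated minds
# carry almost all of the measure, and "dust" minds almost none.
observers = ["physically instantiated mind", "dust mind"]
weights = [0.999, 0.001]

draws = random.choices(observers, weights=weights, k=10_000)
frac = draws.count("physically instantiated mind") / len(draws)
print(frac)  # close to 0.999; a uniform pool would give about 0.5
```

Nothing here argues for those particular weights; it only shows that once a distribution over observers exists, sampling is governed by its rules rather than by raw counting of conceivable minds.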
What exactly does this mean? All minds are going to find some ‘justification’ as to why they exist.
Well, they might, if they were coherent enough, transtemporally, to even have anything resembling a thought. But why would that be the case?