Does Tegmark provide any justification for the lower weight thing, or is it a flat-out "it could work if, in some sense, higher-complexity realities have lower weight"?
For that matter, what would it even mean for them to be lower weight?
I'd, frankly, expect the reverse. The more "tunable parameters", the more patterns of values they could take on, so shouldn't they collectively get more measure, not less?
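To make my own question concrete: the only precise reading of "lower weight" I can come up with is something Solomonoff-flavored (this is my guess at a formalization, not something I've seen Tegmark commit to):

$$\mu(T) \;=\; \frac{2^{-K(T)}}{\sum_{T'} 2^{-K(T')}}$$

where $K(T)$ is the length of the shortest program that specifies the structure/algorithm $T$. On that reading, "higher complexity has lower weight" just means "longer minimal description gets exponentially less measure." If that's the intended meaning, then a structure with $n$ extra tunable bits pays a factor of roughly $2^{-n}$ per specific setting of those bits, while the $2^n$ settings collectively sum back to about the same total as the shorter base description, which is exactly the tension with my "more patterns" intuition that I'm trying to resolve.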
For that matter, if some kind of weight/measure/whatever can be applied to the different algorithms, why disallow that sort of thing being applied to different "dust interpretations"?
And any thoughts at all on why it seems like I’m not (at least, most of me seemingly isn’t) a Boltzmann brain?