but you are not giving us any evidence, or even an argument, that dust doesn’t generate observers with only partially ordered experiences.
‘Partially ordered experiences’ require a great deal of functioning order. Even gross hallucinations involve highly coordinated systems interacting in a dysfunctional but structured way. Genuinely disordering the underlying systems doesn’t produce chaotic experiences; it produces no experiences at all.
People who have been in accidents, especially physically shocking ones (like car accidents involving sudden, violent stops), tend to lose all recollection not only of the moments right after the blow but also of the time before it. The data is lost before being processed or moved to long-term storage, and there are no ‘experiences’ associated with those moments. It doesn’t take very much disorder to disrupt the processes of conscious experience, and once they’re disrupted there’s no observer there at all. A great deal of continuity is required for a conscious observer to exist along a particular axis of time, and it takes very little to violate that continuity.
The complexity a system needs in order to have experiences is far, far greater than the complexity (or lack thereof) of the data the system can pick up in any brief period of time. If we’re going to consider systems that are partly ordered and partly disordered, the vast majority of them won’t be functional minds perceiving disorderly inputs: there are far more ways for the mind to be improperly constructed than for the incoming sense data to be scrambled.
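The asymmetry can be made vivid with a back-of-the-envelope count. The numbers below are purely illustrative (a 100-bit burst of sense data, a "mind" modeled as a wiring diagram over 1,000 units), not empirical claims about brains:

```python
from math import comb

# Toy numbers, purely illustrative: a brief burst of sense data as a
# 100-bit string, and a "mind" as a wiring diagram over 1,000 units.
sense_bits = 100
units = 1000
possible_connections = comb(units, 2)  # 499,500 potential pairwise links

# Count configurations on a log2 scale so the numbers stay readable:
# 2**100 distinct input patterns vs 2**499500 distinct wiring diagrams.
log2_input_scramblings = sense_bits
log2_miswirings = possible_connections

print(log2_miswirings / log2_input_scramblings)  # 4995.0
```

Even in this tiny model, the configuration space of the perceiving system dwarfs the configuration space of its momentary inputs by thousands of orders of magnitude, and the gap only widens as the "mind" gets larger.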
And that’s not even addressing the deeper issue: algorithms do not somehow magically perceive the systems implementing them. I keep hearing the objection that dust-borne consciousnesses arising within high-entropy systems shouldn’t perceive orderly surroundings, but that’s just silly. The perceived surroundings have absolutely nothing to do with the environment the algorithm is emulated within, and absolutely everything to do with the internal states of the algorithm itself. An algorithm that wasn’t processing highly orderly internal states wouldn’t meet the criteria necessary to qualify as an observer in the first place.
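The point that internal states are independent of the implementing substrate can be sketched concretely. In this minimal toy (all names hypothetical), the same algorithm runs on two "substrates": one stores its states as plain integers, the other scrambles every state label through a random permutation. The algorithm's decoded internal history is identical either way; the substrate's representation is invisible from the inside:

```python
import random

# The algorithm: a simple deterministic transition rule over 16 states.
STATES = list(range(16))
step = lambda s: (s * 5 + 3) % 16

def run(encode, decode, start=0, ticks=10):
    """Run the rule on an encoded substrate; return the decoded history."""
    cell = encode(start)
    history = []
    for _ in range(ticks):
        history.append(decode(cell))
        cell = encode(step(decode(cell)))
    return history

# Substrate A: transparent encoding (states stored as themselves).
plain = run(lambda s: s, lambda c: c)

# Substrate B: a disorderly-looking encoding (labels scrambled by a
# fixed random permutation, with its inverse as the decoder).
perm = STATES[:]
random.Random(42).shuffle(perm)
inv = {v: k for k, v in enumerate(perm)}
scrambled = run(lambda s: perm[s], lambda c: inv[c])

print(plain == scrambled)  # True: internal history doesn't depend on the substrate
```

Nothing about substrate B's chaotic-looking storage leaks into the algorithm's internal trajectory, which is the sense in which an emulated observer's "surroundings" are fixed by its internal states rather than by the medium running it.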
The natural-language arguments being presented have the same flaws, over and over. The claimed conclusions simply do not follow from the specified premises.