And if we restore a different long-term memory instead?
I think I’d still have to say “Nothing of significance happened until memory access occurs.”
Until then, well, how’s it any different than someone stealing your books… and then replacing them before you notice?
Now, as I said, we probably ought to be asking questions like: what if, in the actual “conscious processing” part of the agent, a few bits were changed in one instance, and just that? Initially, before the change propagates enough for the two instances to completely diverge, what should we say? (There’s a toy sketch of that “flip a bit and watch it propagate” picture below.) To say it instantly changes everything completely, well… that seems too much like saying “conscious agents are irreducible”, so...
(just to clarify: I’m laying out a bit of my confusion here more than anything else, plus noting that in our quest to find reductions for aspects of consciousness, we seem to have been implicitly treating agents as irreducible in certain ways)
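(To make that “flip a bit and watch it propagate” picture concrete, here is a toy sketch: two copies of a 64-bit state evolve under the same deterministic update, one copy starts with a single bit flipped, and we count how many bits differ after each step. It’s purely an illustration of gradual-then-total divergence, not a model of an agent or of conscious processing; the state size, the mixing function, and the names `step` and `hamming` are all invented for the example.)

```python
def step(state: int) -> int:
    """One deterministic update of a 64-bit state (xorshift-style mixing)."""
    state ^= (state << 13) & 0xFFFFFFFFFFFFFFFF
    state ^= state >> 7
    state ^= (state << 17) & 0xFFFFFFFFFFFFFFFF
    return state & 0xFFFFFFFFFFFFFFFF

def hamming(a: int, b: int) -> int:
    """Count of bit positions where the two states differ."""
    return bin(a ^ b).count("1")

original = 0x123456789ABCDEF0   # arbitrary starting state
perturbed = original ^ 1        # identical except for one flipped bit

for t in range(8):
    print(f"step {t}: {hamming(original, perturbed)} differing bits")
    original, perturbed = step(original), step(perturbed)
```

(In a run like this the count typically climbs from 1 toward roughly half the bits within a handful of steps, which is the “completely diverged” end of the spectrum; the question above is what to say about the early steps, when the two instances still agree almost everywhere.)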
Indeed. It’s not obvious what we could reduce agents further into without losing agents entirely; bit-for-bit identity is at least clear in a few situations.
(To continue the example: if we see the unaccessed memory as being part of the agent, then obviously we can’t mess with it without changing the agent; but if we intuitively see it more as the agent having Internet access, with the memory being a webpage, then we wouldn’t regard it as part of the agent’s identity.)