My $0.02: all it takes is a system a) without access to its own logs, and b) disposed to posit, for any event E for which a causal story isn’t readily available, a default causal story in which some agent deliberately caused E to advance some goal.
Given those two things, it will posit for its own actions a causal story in which it is the agent, since it’s the capable-of-agency thing most tightly associated with its actions.
Note that this does not require the absence of free will (whatever that even means, assuming it means anything); it merely asserts that, free will or not (or some third alternative, if there is one), the system will classify its actions as its own doing unless it has some specific reason to classify them otherwise.
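To make the mechanism concrete, here's a toy sketch (my own illustration, not anything the above specifies; all names and the `has_log_access` flag are hypothetical) of a system with those two properties attributing its own actions to itself:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    description: str
    known_cause: Optional[str]  # a causal story, if one is readily available
    produced_by_self: bool      # did this system's own actuators produce it?

def explain(event: Event, has_log_access: bool = False) -> str:
    # If a causal story is already available, just use it.
    if event.known_cause is not None:
        return event.known_cause
    # With access to its own logs, the system could trace the real mechanism;
    # without it, fall back to the default "some agent deliberately caused E
    # to advance some goal" template.
    if event.produced_by_self and not has_log_access:
        # The capable-of-agency thing most tightly associated with the event
        # is the system itself, so it casts itself as the agent.
        return "I did this, deliberately, to advance one of my goals."
    return "Some agent did this deliberately to advance some goal."

print(explain(Event("arm moved", known_cause=None, produced_by_self=True)))
# -> "I did this, deliberately, to advance one of my goals."
```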