Here, let me go back in time and become timtyler and retroactively write the following instead, thereby averting this whole discussion:
Calculating what would happen if an agent took some action is not counterfactual from the POV of the agent. The agent doesn't know what action it is going to take. If it did, it would just wait for itself to take the action rather than spend time calculating the consequences of its various possible actions.
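To make the point concrete, here is a minimal sketch (my own illustration, not anything timtyler wrote; `predict_outcome` and its toy utilities are invented for the example). From the agent's POV, evaluating each candidate action is ordinary prediction, because the chosen action is still unknown at deliberation time:

```python
def predict_outcome(action):
    # Hypothetical world model: a toy utility for each candidate action.
    return {"wait": 0, "act": 10}[action]

def choose(actions):
    # The agent deliberates precisely because it does not yet know which
    # action it will take: it predicts consequences for every option,
    # then commits to the best one.
    return max(actions, key=predict_outcome)

print(choose(["wait", "act"]))  # → act
```

Only once the agent has committed would "what if I had waited?" become counterfactual; during deliberation, every option is still live.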