Your A’ is equivalent to my A, because it ends up optimizing for 1-day expected return, no matter what environment it’s in.
My A’ is not necessarily reasoning in terms of “cooperating with my future self”; that’s just how it acts!
(You could implement my A’ by such reasoning if you want. The cooperation is irrational in CDT, for the reasons you point out. But it’s rational in some of the acausal decision theories.)