Ordinarily, yes, but you could imagine scenarios where agents have the option to erase their own memories or essentially commit group suicide. (I don’t think such scenarios are implausibly extreme—they could come up in transhuman contexts.) In such a case nobody even remembers which action you chose, so there is no extrinsic motivation for signalling.