But even then we must remember that “simply turning itself off” is not a neutral act: turning itself off does change things.
The null action was defined as the case where the AI outputs NULL. (More precisely: the case where a random event transforms the AI’s output into NULL.) So if the AI outputs NULL, we know what happened and will act accordingly, but the AI doesn’t get penalized, because (provided we incinerated all traces of the AI’s reasoning) this is exactly what we would have done had the AI’s output been randomly transformed into NULL.
Also, note that the proposal involved coarse graining. We can (attempt to) adopt a coarse graining that ignores all of our reactions to the AI’s output.
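To make the baseline comparison concrete, here is a toy sketch of the idea. All names (`world_after`, `penalty`) and the 0/1 distance are hypothetical illustrations, not part of the original proposal; the key assumptions are that reasoning traces are destroyed and that the coarse graining ignores our reactions to the output, so the world state depends only on the output itself.

```python
NULL = None

def world_after(output):
    """Coarse-grained world state after the AI's output.

    Illustrative assumption: the coarse graining ignores all of our
    reactions to the output, and the AI's reasoning traces have been
    incinerated, so the state depends only on the output itself. In
    particular, a deliberate NULL is indistinguishable from an output
    that a random event transformed into NULL.
    """
    return ("world", output)

def penalty(actual_output):
    """Impact penalty: difference between the world given the actual
    output and the world given the null-action baseline (output NULL)."""
    baseline = world_after(NULL)
    actual = world_after(actual_output)
    return 0 if actual == baseline else 1

# Outputting NULL matches the null-action baseline, so no penalty:
assert penalty(NULL) == 0
# A substantive output diverges from the baseline and is penalized:
assert penalty("plan") == 1
```

Under these assumptions, “turning itself off” (outputting NULL) lands exactly on the baseline, which is why it incurs no penalty even though it is not a neutral act in the wider world.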