I don’t remember the experience, but it sounds like a collection of absent-minded System 1 responses that build on each other; there doesn’t appear to be a preferred direction to them. This is also the characterization from the comment itself:
“My mind confused this single thing for the light turning off, and then produced a whole sequence of complex thoughts around this single confusion, all the way relying on this fact being true.”
As I understand it, “rationalization” refers to something like optimization of thoughts in the direction of a preferred conclusion, not to any kind of thinking under a misconception. If I believe something wrong, of course I’ll build on that wrong belief and draw further wrong conclusions until I notice the mistake.