Ideas similar to Eliezer's can occur to people without proper physics, an experimental spirit, or an understanding of the brain (though I am not sure I can say “without rationality”, as the Art may not be what I think it is). What I mean is that some Indian spiritual traditions have explicitly stated that although you feel and believe that you have a real self, although you feel your existence as an entity strongly, this is not acceptable evidence for the existence of your “self”. This is their key to selflessness.
In other words, you may feel that your existence lies outside of physics or whatever reality you believe in, and yet you should not trust this feeling. This sounds rational to me, but it is complicated by the fact that their tenets call for the abandonment of the self, so the conclusion was not reached on fair ground. Also, the follow-up question of life choices and meaning is dissolved by obligations that mainly consist of living an intellectual life as prescribed.
I do not recommend reading this kind of material; it can hurt. I’m just making the point that even without the scientific method, even while thinking your attitudes can control your afterlife, you can start having these meta-thoughts and actually be somewhat right. Maybe this fact is relevant to, um, AI theory?