As I understand it, Gurdjieff and such are claiming it’s possible to have knowledge reliable enough that basing action on anything else is obviously unattractive.
That kind of certainty does exist in some realms: if someone claims to have trisected the angle or built a perpetual motion machine, you can be sure there’s a mistake or fraud somewhere, and you also aren’t going to spend your own time attempting those projects.
Whether such knowledge is possible for more complex situations isn’t obvious, but I do think that’s where he’s pointing.
Reading the quote and your explanation, I thought of this:
Through my mind flashed the passage:
“Do nothing because it is righteous, or praiseworthy, or noble, to do so; do nothing because it seems good to do so; do only that which you must do, and which you cannot do in any other way.”
Doing what it seemed good to do, had only led me astray.
So I called a full stop.
And I decided that, from then on, I would follow the strategy that could have saved me if I had followed it years ago: Hold my FAI designs to the higher standard of not doing that which seemed like a good idea, but only that which I understood on a sufficiently deep level to see that I could not do it in any other way.
-- My Bayesian Enlightenment
“Do nothing because it is righteous, or praiseworthy, or noble, to do so; do nothing because it seems good to do so; do only that which you must do, and which you cannot do in any other way.”
If I took that advice literally, I wouldn’t do much of anything at all.
I’m resisting googling this… Ursula K. Le Guin, right? Though it sounds like something out of the Dhammapada.
Yes, The Farthest Shore. here or here