Well, I for one am confused much of the time, and whenever I encounter someone who ostensibly isn’t, I get nervous. Believing falsehoods isn’t just the domain of dark artisans; it comes courtesy of having a brain. My belief that “most of my beliefs must have large error bounds” probably has among my lowest error bounds; I’m surest about being unsure.
I do wonder whether convincing oneself of having given up self-deception isn’t the greatest dark side achievement of all; after all, how would you know it’s not? Maybe you got tricked by your System 1.
But I’m being contrarian. Good post overall. The metric I’d prefer for belief improvement is “number of times I’ve noticed my confusion and bent myself to accept what I perceive, instead of bending my perceptions to myself”. More of an engineering approach, one that still allows for a few sacred cows that don’t get slaughtered (without overly compromising your overall strength as a rationalist).