that you should be able to see through if you really understood “how an algorithm feels from the inside”, or the rest of the mind projection fallacy sequence.
Upvoted for this in particular. There is one possible interpretation that is something of a status hit, but on LessWrong this should be read as a statement of the form: “You making this statement is strong evidence that you do not understand something. If you believe you do understand that something, now is the time to notice you are confused!”
It would be nice if there were a way to tell somebody that they don’t understand something, without it being a status hit!
Of course, if such a thing were possible, human history and civilization would look a LOT different than they currently do. ;-)
Yeah, precisely. It’s doable, if difficult, in person, but I haven’t the slightest idea how to phrase it in text.