Oh! Oh! Oh! Oh! Oh! You shitheads think you're doing something to me with your insect downvotes? Oh! Oh! Oh! Oh! Oh! The only thing you did is prove to me that you all eat SHIT! Not a single one of you can leave a sensible comment here. The next person who downvotes me is gonna get their ass beat by a kangaroo!
PeteG
To all of you who downvote my post and my comments: EAT SHIT AND DIE!
Another idiot downvoted my post and my comment. Is this post too much for you idiots to handle, or what? Would you rather read and upvote some garbage about football?
Whoever downvoted my post: you are a stupid sack of shit who did not understand anything I wrote.
Layers Of Mind
The AI tells me that I believe something with 100% certainty, but I can’t for the life of me figure out what it is. I ask it to explain, and I get: “ksjdflasj7543897502ijweofjoishjfoiow02u5”.
I don’t know if I’d believe this, but it would definitely be the strangest and scariest thing to hear.
Rationality is winning that doesn't generate a surprise; randomly winning the lottery generates a surprise. A good measure of rationality is the amount of complexity involved in winning, together with how little surprise the win generates. If winning at a certain task requires a method with many complex steps, and you win without surprise, then the method you used was a very rational one.
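The "surprise" in this idea can be made quantitative with information-theoretic surprisal, −log₂(p): the lower the probability you assigned to the win, the more surprising it is. A minimal sketch (the function name and the probabilities are my own illustrative choices, not from the comment):

```python
import math

def surprisal_bits(p: float) -> float:
    """Information-theoretic surprise (in bits) of an event with probability p."""
    return -math.log2(p)

# Winning a 1-in-300-million lottery: enormous surprise.
lottery_win = surprisal_bits(1 / 300_000_000)

# Winning a game your method wins 95% of the time: almost no surprise.
skilled_win = surprisal_bits(0.95)

print(f"lottery win: {lottery_win:.1f} bits of surprise")   # ~28.2 bits
print(f"skilled win: {skilled_win:.3f} bits of surprise")   # ~0.074 bits
```

On this reading, a rational method is one that drives the surprisal of winning toward zero: the win was already predicted by the method, so observing it conveys almost no new information.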
“You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be.”
I don't think Eliezer cared about correcting someone's one wrong belief so much as he cared about correcting the core that makes many such beliefs persist. Would he really have helped her if all his rational arguments failed, but his emotional one succeeded? My guess is that it wouldn't be a win for him or her.
How to Bind Yourself To Reality is the number one thing people should GET. But my guess is that this one might not be teachable.
The most frequent would have to be my avoidance of settling for cached thoughts. I notice, revise, and completely discard conclusions much more regularly and effectively when I recognize that the conclusion was generated the moment the question was asked.
The Wrong Question sequence was amazing: one of the very unintuitive sequences that greatly improved my categorization methods, especially the 'Disguised Queries' post.
You can’t use “humanity” and “Graham’s Number” in the same sentence.