When Eliezer Yudkowsky woke up as Britney Spears, he recorded the world’s most-reviewed song about leveling up as a rationalist.
Eliezer Yudkowsky got Clippy to hold off on reprocessing the solar system by getting it hooked on HP:MoR, and is now writing more slowly in order to have more time to create FAI.
If you need to save the world, you don’t give yourself a handicap; you use every tool at your disposal, and you make your job as easy as you possibly can. That said, it is true that Eliezer Yudkowsky once saved the world using nothing but modal logic and a bag of suggestively-named Lisp tokens.
Eliezer Yudkowsky once attended a conference organized by some above-average Powers from the Transcend that were clueful enough to think “Let’s invite Eliezer Yudkowsky”; but after a while he gave up and left before the conference was over, because he kept thinking “What am I even doing here?”
Eliezer Yudkowsky has invested specific effort into the awful possibility that one day, he might create an Artificial Intelligence so much smarter than him that after he tells it the basics, it will blaze right past him, solve the problems that have weighed on him for years, and zip off to see humanity safely through the Singularity. It might happen, it might not. But he consoles himself with the fact that it hasn’t happened yet.
Eliezer Yudkowsky once wrote a piece of rationalist Harry Potter fanfiction so amazing that it got multiple people to actually change their lives in an effort at being more rational. (...hm’kay, perhaps that’s not quite awesome enough to be on this list… but you’ve got to admit that it’s in the neighbourhood.)
When Eliezer Yudkowsky does the incredulous stare, it becomes a valid argument.