What is that quote of Scott’s… Something about how the sequences obsolete themselves. And that he remembers the sequences being full of all these great insights about difficult topics—but when he goes back and rereads them, it’s all just so obvious.
You probably see where I’m going with this. It seems entirely possible that when you say “oh, it’s easy, you just notice when you’re making a hypothesis that might be too strong and then come up with a way to test it,” you are in fact doing the complete content of that sequence post that seemed insightful way back when; it’s just that it’s easy to you now.
That’s part of it, but also Eliezer sometimes makes things sound more complicated than they are. This exchange is a nice example.
Eliezer: And if you think you can explain the concept of “systematically underestimated inferential distances” briefly, in just a few words, I’ve got some sad news for you...
enye-word: “This is going to take a while to explain.” Did I do it? Did I win rationalism?!