So, I’ll say this. Eliezer’s posts put the nail in the coffin of my theism in a matter of a couple of months. They gave me room to sit, ponder, and wonder night after night, and let my entire mind get on board with a major worldview shift. Each individual post assailed the narratives I was telling myself; because the conclusions weren’t handed to me up front, I traveled the journey to them on my own, which ended up far more compelling than arguments I had previously read with a more explicit agenda.
That said, now that I find myself much closer to both of your worldviews, I find your posts much more efficient, and I more often recommend yours to others over Eliezer’s. I do this because I expect more impatience from others than I find in myself.
This is a problem I feel with Less Wrong in general, though I know little of what could be done about it. I often recommend posts to outsiders, only to receive little more than skepticism and comments on the styles or apparent agendas of the authors. “But the logic and evidence-awareness make them unassailable!” does little against established narratives. Narratives which suggest that coffee-table books by New York Times bestselling authors who give TED talks are the quickest route to insight. I never held that narrative, and I was already looking for insight in new places, so the initial push of taking the time to read the Sequences was all I needed to be carried along more or less to their finish line.
Your point about different audiences is apt here. For most of the world, the Sequences and the scattershot back-referenced posts here are completely irrelevant to established patterns of trawling trusted media for dinner-party political wisdom. But giving in to that attitude would heavily decrease the value of this site to most of its current participants. Especially me, so don’t do it, yo.
This is a problem with writing in general: different audiences require different speeds of explanation and levels of detail. Write for beginners, and the experts will complain that your text advances too slowly and repeats endlessly. Write for experts, and the beginners will complain that they can’t understand it.
(Even worse, write a longer text for experts because the inferential distance is high; the experts will underestimate the inferential distance, pattern-match your text as writing for beginners, complain that there were too many details they had to skip, and then write a comment showing some basic misunderstanding they wouldn’t have made if they hadn’t skipped those parts. -- “You wrote a lot of stuff about AI that everyone who has studied AI knows, so I skipped it. Why don’t you simply make an AI that does not have goals? Also, a superhuman AI would obviously be smart enough to invent the correct morality. What? No, I am not going to read the Sequences!”)
So when you want to recommend something to other people, you shouldn’t recommend what is best for you now, because you are now a kind of expert (at least in LW lingo) and they are beginners.
It’s analogous to when someone asks me to recommend good online texts for learning programming. What is a good source for me is not a good source for a beginner. I prefer an encyclopedia, to fill in the missing details; a beginner needs a textbook that explains the simple concepts one at a time.