If you had to pick exactly 20 articles from LessWrong to provide the greatest added value for a reader, which 20 articles would you select?
In other words, I am asking you to pick “Sequences: Micro Edition” for new readers, or old readers who feel intimidated by the size and structure of Sequences. No sequences and subsequences, just 20 selected articles that should be read in the given order.
It is important to consider that some information is distributed across many articles, and some articles rely on information explained in previous articles. Your selection should make sense for people who have read nothing else on LW and cannot click on hyperlinks for explanation (as if they were reading the articles on paper, without comments). Do the introductory articles provide enough value even if you don’t put the whole sequence into the selected 20? Is it better to pick examples from more topics, or to focus on one?
Yes, I am hoping that reading those 20 articles would encourage the reader to read more, perhaps even the whole Sequences. But the 20 articles should provide enough value when taken alone; they should be a “food”, not just an “appetizer”.
It is OK to pick also those LW articles that are not part of the traditional Sequences. It is OK to suggest less than 20 articles. (Suggesting more than 20 is not OK, because the goal is to select a small number of articles that provide value without reading anything more.)
Now let’s try it differently. Even if you feel that 20 articles is too small a subset to describe the richness of this site, let’s push it even further. Imagine that you can only list 10 articles, or 7 articles, 5 articles, 3 articles, or just the 1 single best article of LessWrong. It will be painful, but please do your best.
Why? Well, unless one of us puts their selection of 20 articles on the wiki ignoring the others, the resulting selection will be a mix of something that you would select and something that you wouldn’t. The resulting 20 articles will contain only 10 or maybe fewer articles from your personal “top 20” selection. So let’s make those the best 10 articles.
However, I ask you to avoid using strategies like this: “I think articles A and B are good. A is better than B, so if I had to choose only one article, I should choose A. But article A is widely popular, and most other people will probably choose it too, therefore I will pick B, which maximizes the chance that both A and B will be in the final selection.” Please avoid this. Just pretend that the remaining articles will be chosen randomly (even if other people have already posted their choices), so you should really choose what you prefer most. Please cooperate on this Prisoner’s Dilemma.
Also, please explain your reasons for selecting those articles. Maybe you see an aspect others are missing. Maybe others can suggest another article which fulfills your goal better. (In other words, if you explain yourself, others can extrapolate your volition.)
My choice, the most important three articles:
Why truth? And…—it contains motivation for doing what we do, and explains the “Spock Rationality” misunderstanding
An Intuitive Explanation of Bayes’ Theorem—a biology/medicine example focusing on women, and an interactive math textbook (great for balancing the LW bias: male, sci-fi, computers, impractical philosophy, nonstandard science); see the short worked sketch after this list
Why Our Kind Can’t Cooperate—a frequent failure mode of unknowingly trying to reverse stupidity in real life; important for those who hope to have a rational community
then these:
How to Be Happy—a lot of low-hanging fruit for a new reader, applying science to everyday life; bonus points for being written by someone else
Something to Protect—bringing the motivation to the near mode; the moral aspect of becoming rational
Well-Kept Gardens Die By Pacifism—a frequent failure mode of online communities; an explanation of the LW moderation system
and then these:
Making Beliefs Pay Rent (in Anticipated Experiences)—the difference between a useful and a useless belief, and how to avoid discussing mere words
Knowing About Biases Can Hurt People—a warning about a possible failure mode for people who enjoy reading articles about (other people’s) biases
Guessing the Teacher’s Password—education is an important topic for many people
How to Beat Procrastination—an important topic for many people online, and also a very popular one (might bring hyperlinks to LW)
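For readers who want a taste of what the Bayes article teaches before committing to it, here is a minimal sketch, in Python, of the mammography example the article is built around. The probabilities are the ones the article uses; the code itself is just an illustration, not part of any LW post:

```python
# The mammography screening example from "An Intuitive Explanation
# of Bayes' Theorem":
#   1% of screened women have breast cancer (the prior),
#   80% of women with cancer test positive (the sensitivity),
#   9.6% of women without cancer also test positive (false positives).

prior = 0.01            # P(cancer)
sensitivity = 0.80      # P(positive | cancer)
false_positive = 0.096  # P(positive | no cancer)

# Law of total probability: overall chance of a positive test.
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Bayes' Theorem: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
posterior = sensitivity * prior / p_positive

print(f"P(cancer | positive) = {posterior:.3f}")  # ~0.078, i.e. about 7.8%
```

The article’s punchline is that most people, including most doctors, guess a probability roughly ten times higher than this; working through the arithmetic is what makes the correction stick.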
Note: I think that each of these articles can be read and understood separately, which in my opinion is good for total newbies. People expect short inferential distances, and you must first gain their attention before you can lead them further. If they enjoy the MicroSequences, they will be more likely to continue with the Sequences. I also think these articles are not controversial or weird, so they will give a good impression to an outsider. The selection includes math, instrumental rationality, and the social aspects of rationality.
Funny thing: it was rather painful to reduce my suggested list to only 10 articles, but now I feel happy and satisfied with the result. Please make your own list, independently of this one. (Imagine that you have to select 10 or fewer articles for a friend.)
OK, my first shot, probably just to encourage people to do better than this:
The Simple Truth (humans love storytelling)
Why truth? And… (the motivation for what we do)
What Do We Mean By “Rationality”?
An Intuitive Explanation of Bayes’ Theorem (medical examples = serious business)
Making Beliefs Pay Rent (in Anticipated Experiences) (important point)
The Virtue of Narrowness (important point)
Guessing the Teacher’s Password (people will agree with this)
Universal Fire
Universal Law
Mind Projection Fallacy
Politics is the Mind-Killer
Applause Lights (an example of mindkilling)
Affective Death Spirals
Reversed Stupidity Is Not Intelligence (important point)
Uncritical Supercriticality (mindkilling in extreme)
Knowing About Biases Can Hurt People (important point)
Something to Protect (our connection to reality)
Why Our Kind Can’t Cooperate (community building)
Well-Kept Gardens Die By Pacifism (community protecting)
3 Levels of Rationality Verification
Practical Advice Backed By Deep Theories (instrumental value of LW)
How to Be Happy (something you can try at home)
EDIT: Oops, it was more than 20; I was in a hurry. The more important (IMHO) articles are now marked in bold, with explanations added.
Seems to me like you need Mysterious Answers to Mysterious Questions in there. That’s far and away one of my favorites.
(You can single-space your posts by putting two spaces at the end of each line. Do this, for it will save scrolltime.)
I’m going to avoid repeating ones on your list, entirely because I think repetition is bad. Here I go:
Twelve Virtues of Rationality (my favorite thing Eliezer has ever written bar none)
A Fable of Science and Politics
Transhumanism as Simplified Humanism
Absence of Evidence Is Evidence of Absence
The Fable of the Dragon-Tyrant
The Proper Use of Humility
Expecting Short Inferential Distances (I think this one and Knowing About Biases Can Hurt People should maybe be the first parts of the Sequences everybody reads.)
Truly Part of You
Newcomb’s Problem and Regret of Rationality
Three Dialogues on Identity (another favorite of mine)
The 5-Second Level
Cached Selves
Learned Blankness
Why You’re Stuck in a Narrative
Your Inner Google
Errors vs. Bugs and the End of Stupidity
Yes, a Blog (more of an appetizer, I guess)
A Parable on Obsolete Ideologies
The trouble with picking stand-alone posts is that Eliezer’s sequences of posts are so much better.
What are the ones you would include if you were including repeats? (Viliam_Bur is asking for an absolute top 20, not several independent lists of good posts.)
Who exactly is “Simple Truth” aimed at? As far as I can tell, the message is that worrying about cashing out the meaning of truth is not worth the effort in ordinary circumstances. That’s true, but it is a fully generalizable counter-argument to studying anything—worrying about the meaning of “quantum configuration” has no practical payoff, even though building things like computers relies on studying those sorts of things. Likewise, the meaning of truth is really hard if you actually examine it.
Put differently, religious people don’t disagree with us about what truth means, they disagree about what is actually true. And they are wrong, for the reasons detailed in “Making Beliefs Pay Rent.” In short, no real person is analogous to Mark, so no real person’s philosophical positions are contradicted by the story.
To repeat, the story doesn’t solve any real questions about truth; it simply says they are practically [Edit] unimportant (which is true, but makes the story itself pretty unhelpful).
For me the message of “Simple Truth” was that intelligence should not be used to defeat itself. To be right, even if you can’t define it to a philosopher’s satisfaction, is better than to be wrong, even if you can find some smart words to support it. The truth business is not about words (that’s the signalling business): when you are right, nature rewards you, and when you are wrong, nature punishes you. (Although among humans, speaking the truth can cause you a lot of trouble.) At the same time it explains the origins of our ability to understand truth—we have this ability because having it was an evolutionary advantage.
Or maybe I just like that the annoying wise-ass guy dies in the end.
This is not about religious people, who disagree about what is actually true, as you said. This is about people who try to do “philosophy” by inventing ever more complex ways to sound stupid… errr… profound, and who perhaps even sometimes succeed in convincing themselves. People who say things like “there is no truth”, because for anything you say they can generate a long sequence of words that you just don’t have time to analyze and debunk (and even if you did, they would just use a fraction of that time to generate a new sequence of words). If you haven’t met such people, consider yourself lucky, but I know people who can role-play Mark and thus ruin any chance of a rational discussion, and to a non-x-rational listener it often seems like their arguments are rather important and deep and should be addressed seriously.
Anyway, “The Simple Truth” is kinda long, which I enjoyed but other people may hate; so there is probably no harm in removing it, as long as “Making Beliefs Pay Rent” and “Something to Protect” stay on the list.
intelligence should not be used to defeat itself
I agree with this feeling, but “Do the impossible” or one of the nearby posts raises this point more explicitly and more effectively.
The problem with “Simple Truth” is that—beyond the message I highlighted—the text is too open-ended. Mirror-like, the story contains whatever philosophical positions the reader wishes to see in it.
I know people who can role-play Mark
There are two possible kinds of people who can do this. (1) People with useful but complicated theories that you happen not to understand, and (2) stupid people—who might be poorly parroting a useful theory. Please don’t let the (negative) halo effect of the second type infect your view of the first type of people.
Generally, your objection pattern-matches to the argument that law is too complicated. Respectfully, I disagree.
I think you mean “practically unimportant” in your last sentence.
I’ve always understood the purpose of that article to be to pre-emptively foreclose objections of the form “but being rational is irrelevant, because you can’t really know what’s true” by declaring them rhetorically out-of-bounds.
Indeed a typo, thanks.
I’ve always taken the objection you mentioned as invoking the problem of the reliability of the senses (i.e. Cartesian skepticism), not the meaningfulness of truth. In the story, Mark is no Cartesian skeptic (of course, it’s hard to tell, because Mark is a terribly confused person).
I think skeptical objections to Bayesian reasoning are like questions about the origin of life directed at evolutionary theory. The criticisms aren’t exactly wrong—it’s just that the theory targeted by the criticism is not trying to provide an answer on that issue.
I’d add something like Keep Your Identity Small, Beware Identity.
Which twenty have the highest number of votes?
These, but that’s probably not the best way to go about making a list. Many of the top posts require prerequisites, and there are some equally good posts that are not as heavily upvoted because they were published on OB or in LW’s infancy.
I actually started working on something similar, but it never really took off and real-world responsibilities prevented me from working on it for a while. Feel free to pick up where I left off. Anyway, here’s my first attempt (I may try again later):
Twelve Virtues of Rationality
The Cognitive Science of Rationality
The Bottom Line
Making Beliefs Pay Rent
An Intuitive Explanation of Bayes’ Theorem
Knowing About Biases Can Hurt People
A Fable of Science and Politics
Hindsight Devalues Science
Taboo Your Words
The Least Convenient Possible World
The Apologist and the Revolutionary
Mind Projection Fallacy
Confidence Levels Inside and Outside an Argument
The Fallacy of Gray
Ugh Fields
Cached Selves
Conjunction Fallacy
Understanding Your Understanding
Humans are not automatically strategic
How to Beat Procrastination
I don’t know if the intention here is to debate other people’s choices, but: my wife started The Simple Truth because it was the first sequence post on the list, and she quickly became frustrated and annoyed that it didn’t seem to lead anywhere and seemed to be composed of “in-jokes.” She didn’t try to read further into the Sequences because of the bad impression she got from this article, which is an unusually weird, long, rambling, quirky article.
I actually like The Simple Truth but I don’t feel that it makes a good introduction to the Sequences. But hey, this is just one data point.
I predict that when your wife read “The Simple Truth” she was not acquainted with (or was not thinking about) the various theories of truth that philosophers have come up with. I like it a lot, but when I first read it I was able to see it as a defense of a particular theory of truth and a critique of some other ones.
(In particular, it’s a defense of the correspondence theory, though see this thread.)
Edit: In other words, I think “The Simple Truth” appeals mainly to people who have read descriptions of the other theories of truth and said to themselves, “People actually believe that?!”
You’re correct. What I love about the Sequences in general is that they’re a colloquial, patient introduction to lots of new concepts. In theory, even somebody with no background in decision theory or quantum mechanics can actually learn these concepts from the Sequences. The Simple Truth is significantly different in tone and style from the majority of Sequence posts, and the concepts which that post satirizes are not really introduced before the comedy begins.
If you go to http://wiki.lesswrong.com/wiki/Sequences and choose the first option (1 Core Sequences), then choose the first listed subsequence (Map and Territory), the very first post is The Simple Truth. The second choice is What Do We Mean by Rationality? which really, really seems like it should be the first thing a newcomer reads.
I actually like The Simple Truth but I don’t feel that it makes a good introduction to the Sequences.
Same here, though I think it does depend on the reader’s background. People who strongly disbelieve in the concept of objective truth might find it helpful to have that taken care of before starting the sequences proper, but even then I’m not sure The Simple Truth is the best way.
You might be right—I’ll have to re-read it. I put this list together based on my memory of what these posts are like, and given how volatile memories are, I may be mistaken about their quality.
Edit: You’re right. I’ll change my list accordingly.
What is your intended audience, and what is the intended effect of reading these sequences? “Politics is the Mind-Killer” and “Well-Kept Gardens Die By Pacifism” seem particularly relevant to online communities, for instance.
It was intended for new people on LW, who should be introduced to our “community values” (even without reading the whole Sequences). Also for smart people outside LW who are curious what LW is about and might decide to join later.
In both cases, the goal is to make clear what LW (and x-rationality) is, and what it is not, in a short amount of text. Perhaps writing a new text would be better, but making a selection of existing texts should be quicker.
“Politics is the Mind-Killer” and “Well-Kept Gardens Die By Pacifism” seem particularly relevant to online communities, for instance.
Yes, but I think they also apply well offline. People can discuss politics in person, too. The lesson of well-kept gardens is indirect: some people are a net loss, and if you don’t filter them out of your social network, your quality of life will go down.
Now I added some explanations to my list, so the message is like this:
there is such a thing as truth/territory, and it has consequences in real life
to know = to make good predictions
it’s not about speaking mysteriously or using the right keywords, but about understanding the details
protect your values, don’t use your intelligence to defeat yourself
don’t let your emotions and biases make you stupid, but also don’t try to reverse stupidity
a rational community is a great idea, but it requires specific skills
here is how to use rationality to improve your everyday life
Nice idea—but maybe we should compress things further? I’ve read most of the sequences, and I think/hope they could be condensed to about 10-20 pages carrying the core messages, in a way that would be more accessible outside these circles.
I guess the idea is to find 20 articles, out of those that already exist, that provide both the ideas and the arguments. Once there is some solution, it becomes easier to write those 20 pages than it would be starting from scratch. Obviously, the 20 paper pages you mention have yet to be written—while the “20 best articles for isolated reading” may be written already.