Not a stupid question per se, but it’s beside the point of the original poster.
They aren’t suggesting that this is a choice that would actually come up for some well-formed reason; rather, they are asking “How important is rationality relative to intelligence?” and couching that question as “Would you exchange one unit of rationality (expressed as the contents of the Sequences) for N units of intelligence (expressed as IQ points)?”
Any other units of rationality and intelligence could be swapped in instead without losing the main point of the question.
.
That sounds accurate, but I imagine the largest region of uncertainty under discussion at the moment has to do with the practical relationship between LW-style rationality and bringing one's perceptions into harmony with the outside world.
.
Back to the Basics of Rationality, along with the stuff it links to, seems like it might be the closest to what you’re looking for. The more general subject of rationality outreach has come to be fairly popular, though; Effective Rationality Outreach and Tweetable Rationality are recent high-karma posts, for example. I don’t think much of a consensus on methods could be said to exist yet, although there seems to be a consensus that outreach is a good idea.
Raising the Sanity Waterline is a popular Eliezer post on a related subject, and you’ll probably see its name getting thrown around when the topic is broached.
.
Yes, I think that’s an excellent rephrase. Perhaps with a “To what degree...” tacked onto the front of it.
It is probably not an uncontroversial rephrase, though, since equating intelligence with the ability to juggle large numbers of mental objects is itself probably controversial.
That said, I endorse it.
(Though Nornagest is also correct that there's an "are the Sequences actually good for conveying rationality?" interpretation, which I personally find a less interesting question.)
.
This blog is all about illustrating cognitive biases with concrete examples.
I have indeed, and am fond of it. During my days as a technical writer, I had that list tacked up on my wall for a time.
And yeah, invoking concrete examples when things get too abstract to follow is a fine, fine thing. Worst case, it makes very clear to others where my understanding is flawed.
There is a fair bit of this sort of concrete work in LW posts, both Sequence and non, but there's always room for more.