This whole article makes a sleight-of-hand assumption that more rational = more time on LW.
Yvain isn’t talking about rationality, he’s talking about membership in a rationalist group. (He says “training”, but he’s looking at time and status in community, not any specific training regime.) That “-ist” is important: it denotes a specific ideology or methodology. In this case, that’s one that’s strongly associated with the LW community, so using time and karma isn’t a bad measure of one’s exposure to it.
Myself, I’d be interested to see how these numbers compare to CFAR alumni. There’s some overlap, but not so much as to rule out important differences.
I dislike this usage, and in fact I find it offensive. Even with the “-ist” appended, it’s an appropriation of a term that has a general meaning of “thinking clearly,” which gets redefined as a label of membership in a given community.
Personally, I’m more bothered by the fact that it shares a name with an epistemological stance that’s in most ways unrelated and in some ways actually opposed to the LW methodology. (We tend to favor empiricist approaches in most situations.) But that ship has sailed.
Yvain isn’t talking about rationality, he’s talking about membership in a rationalist group.
My understanding is that one’s rationality (or ability to be rational) would increase as a result of participation in rationalist training. Hence, I see your distinction, but little, if any, difference.
In this case, he assumes (1) LW is rationalist and (2) LW is good at providing training that makes a participating member more rational.
Karma does not necessarily have anything to do with rationality, being rational, rationalist training, etc. It is a point system in which members of LW give points to content they want more of. It has also been used as a reward for doing tasks for LW for free, for mass downvoting of dissenting political views, and even for filling out the survey we are discussing in this post.
...(3) No one turns up as a newbie at LW having already learnt rationality.
That is, in fact, the question Yvain is discussing.