I have come across serious criticism of the PhD programs at major universities here on LW (and on OB). This is not quite the same as a recommendation not to enroll in a PhD program, and it is most certainly not the same as a recommendation to quit an ongoing PhD track, but I definitely interpreted such criticism as advice against pursuing a PhD. Then again, I have also heard similar criticism from other sources, so it might well be a genuine problem with some PhD tracks.
For what it’s worth, here are my personal experiences with the list of main points (not sure if this should be a separate post, but I think it is worth mentioning):
Rationality doesn’t guarantee correctness.
Indeed, but as Villiam_Bur mentions, this is way too high a standard. I personally notice that, while not always correct, I am certainly correct more often thanks to the ideas and knowledge I found on LW!
In particular, AI risk is overstated
I am not sure, but I was under the impression that your suggestion, ‘just build some AI, it doesn’t have to be perfect right away’, is exactly the idea researchers got stuck on last century, when people optimistically set out to build an AI and kept failing (the problem being that even a dumb prototype turned out to be insanely complicated). Why should our attempt be different? As for AI risk itself: I don’t know whether LW is blowing the risk out of proportion (in particular, I do not disagree with them; I am simply unsure).
LW has a cult-like social structure.
I agree wholeheartedly; you beautifully managed to capture my feelings of unease. By targeting socially awkward nerds (such as me, I confess), it becomes unclear whether the popularity of LW among intellectuals (e.g. university students; I am looking for a better word than ‘intellectuals’ but cannot find one) is due to genuine content or to a clever approach to a vulnerable audience. However, from my personal experience I can confidently assert that the material on LW (and on OB, for that matter) is indeed of high quality. So the question that remains is: if LW has good material, why do we still target only a very susceptible audience? The obvious answer is that nerds are the most interested in the material discussed, but since there are many more non-nerds than nerds, it would make sense to appeal to a broader audience (at the cost of some quality), right? This would probably take a lot of effort (like rewriting the Sequences for an audience that has trouble grasping fractions), but perhaps it would be worth it?
Many LWers are not very rational.
In my experience non-LWers are even less rational. I fear that, again, you have set the bar too high: reading the Sequences will not make you a perfect Bayesian with Solomonoff priors; at best it will make you a somewhat closer approximation. And let me mention again that I have personally gotten decent mileage out of the Sequences (though I also count the enjoyment of reading the material among the benefits; I come here not just to learn but also to have fun).
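(To make ‘a somewhat closer approximation’ concrete, here is a minimal sketch in Python of the one update rule an ideal Bayesian applies perfectly and the rest of us at best approximate; the medical-test numbers are hypothetical, chosen purely for illustration:)

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    # Posterior odds = prior odds * likelihood ratio; then convert the odds back to a probability.
    posterior_odds = (prior / (1.0 - prior)) * (p_e_given_h / p_e_given_not_h)
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical example: 1% base rate, a test with a 90% hit rate and a 9% false-positive rate.
print(bayes_update(0.01, 0.90, 0.09))  # ~0.092, far below the ~0.9 many people intuit

An ideal reasoner applies this rule consistently everywhere; the realistic benefit of reading the Sequences is merely catching a few more of the cases where intuition gets it wrong.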
LW membership would make me worse off.
I mentioned this earlier. I notice that you define success in terms of money and status (which makes sense), and the easiest way to try to get these would be to use the ‘Dark Arts’. If you want a PhD, just guess the teacher’s password. It has worked for me so far (although I was also interested in learning the material, so I read papers and books in my spare time with understanding as the goal). However, these topics are indeed not discussed on LW (and certainly not in the form ‘in order to get people to do what you want, use these three easy psychological hacks’). Would it solve your problem if such material were available?
“Art of Rationality” is an oxymoron.
Just because something is true does not mean that it is not beautiful?
I agree wholeheartedly; you beautifully managed to capture my feelings of unease. By targeting socially awkward nerds (such as me, I confess), it becomes unclear whether the popularity of LW among intellectuals (e.g. university students; I am looking for a better word than ‘intellectuals’ but cannot find one) is due to genuine content or to a clever approach to a vulnerable audience.
I have been contemplating this point. One of the things that sets off red flags for people outside a group is when people in the group appear to have cut’n’pasted the leader’s opinions into their heads. And that’s definitely something that happens around LW.
Note that this does not require malice or even intent on the part of said leader! It’s something happening in the heads of the recipients. But the leader needs to be aware of it—it’s part of the cult attractor, selecting for people looking for stuff to cut’n’paste into their heads.
I know this one because a loved one is pursuing ordination in the Church of England … and basically has this superpower: convincing people of pretty much anything. To the point where they’ll walk out saying “You know, black really is white, when you really think about it …” and then assume that is a conclusion they came to themselves, when it’s really obvious it was cut’n’pasted in. (These are people of normal intelligence, a bit too easily convinced by a skilled and sincere arguer … but my loved one does pretty well on the smart ones too.)
As I said to them, “The only reason you’re not L. Ron Hubbard is that you don’t want to be. You’d better hope that’s enough.”
Edit: The tell is not just cut’n’pasting the substance of the opinions, but the word-for-word phrasing.
I have been contemplating this point. One of the things that sets off red flags for people outside a group is when people in the group appear to have cut’n’pasted the leader’s opinions into their heads. And that’s definitely something that happens around LW.
The failure mode might be that it’s not obvious that an autodidact who spent a decade absorbing the relevant academic literature will have a very different expressive range from one who spent a couple of months reading the first autodidact’s writings. It’s not hard to get into the social slot of a clever outsider, because the threshold for cleverness in outsiders isn’t very high.
The business of getting a real PhD is pretty good at making it clear to most people that becoming an expert takes dedication and work. Internet forums have no formal accreditation, so there’s no easy way to distinguish between “could probably write a passable freshman term paper” knowledgeable and “could take some months off and write a solid PhD thesis” knowledgeable, and it’s too easy for people in the first category to be unaware how far they are from the second category.
I have been contemplating this point. One of the things that sets off red flags for people outside a group is when people in the group appear to have cut’n’pasted the leader’s opinions into their heads. And that’s definitely something that happens around LW.
I don’t know. On the one hand, that’s how you would expect it to look if the leader is right. On the other hand, “cult leader is right” is also how I would expect it to feel if the cult leader were merely persuasive. On the third hand, I don’t feel like I absorbed many novel things from the cult leader, but mostly concretified notions and better terms for ideas I’d already held; and I remember many Sequences posts having a critical comment at the top.
A further good sign is that the Sequences are mostly retellings of existing literature. It doesn’t really match the “crazy ideas held for ingroup status” profile of cultishness.
The cut’n’paste not merely of the opinions, but of the phrasing is the tell that this is undigested. Possibly this could be explained by complete correctness with literary brilliance, but we’re talking about one-draft daily blog posts here.
So the question that remains is: if LW has good material, why do we still target only a very susceptible audience?
This (to me) reads like you’re implying intentionality on the part of the writers to target “a very susceptible audience”. I submit the alternative hypothesis that most people who post here tend to be of a certain personality type (like you, I’m looking for a better term than “personality type” but failing to find one), and as a result they write material that naturally attracts people with similar personality types. Maybe I’m misreading you, but I think this is a much more charitable interpretation than “LW is intentionally targeting psychologically vulnerable people”. As a single data point: I don’t see myself as a particularly insecure or unstable person, and I’d say I’m largely here because much of what EY (and others on LW) wrote makes sense to me, not because it makes me feel good or fuels my ego.
This would probably take a lot of effort (like rewriting the Sequences for an audience that has trouble grasping fractions), but perhaps it would be worth it?
With respect, I’d say this is most likely an impossible endeavor. Anyone who wants to try is welcome to, of course, but I’m just not seeing someone who can’t grok fractions being able to comprehend more than 5% of the Sequences.
The cut’n’paste not merely of the opinions, but of the phrasing is the tell that this is undigested. Possibly this could be explained by complete correctness with literary brilliance, but we’re talking about one-draft daily blog posts here.
I feel like, charitably, another explanation would be that it is simply better phrasing than people come up with on their own.
So? Fast doesn’t imply bad. Quite the opposite: fast work with a short feedback cycle is one of the best ways to get really good.