Less Wrong has discussed the meme of “SIAI agrees on ideas that most people don’t take seriously? They must be a cult!”
Awesome, it has discussed this particular 'meme'; your words seem to imply that its identification as a cult is attributed to the viral spread of that meme. Has it, however, discussed good Bayesian reasoning, and understood the impact of a statistical fact: even when there is a genuine risk (if there is such a risk), it is incredibly unlikely that the person most worth listening to will lack both academic credentials and any evidence of rounded knowledge, and also be an extreme outlier in degree of belief? There are also the NPD diagnostic criteria to consider. For a non-cult, the probabilities multiply into an incredibly low probability of being extreme on so many parameters relevant to cult identification. (For a cult, they don't multiply like that, because the traits share a common cause.)
edit: to spell out the details: You start with a prior of maybe 0.1 probability that a doomsday-salvation group is a non-cult (and that is a massive benefit of the doubt right there). Then you observe that the founder is an incredibly unlikely combination of traits for a non-cult doomsday-caution advocate, but a very typical founder for a cult, on a multitude of parameters. Then you fuzzily do some knee-jerk Bayesian reasoning (which can, however, be perfectly well replicated with a calculator instead of neuronal signals), and you end up virtually certain it is a cult. That's if you can do Bayes without doing it explicitly on a calculator.

Now, the reason I am here is that I did not take a good look until very recently, because I did not care whether you guys are a cult or not; cults can be interesting to argue with. And EY is not a bad guy at all, don't get me wrong: he himself understands that he is risking making a cult, and he is trying very hard NOT to make one. That is very redeeming. I do feel bad for the guy. He happened to let one odd belief through, and voila, a cult he didn't want. Or a semi-cult, with some people in it for cult reasons and some not so much. He happened to have no formal education, and no notable accomplishments that are easily recognized as challenging (like being the author of some computer vision library, or whatever). He has some ideas. Cult-follower-type people are drawn to those ideas like flies to food.
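Here is a minimal sketch of that calculator version of the update, in Python. The 0.1 prior is the one given above; the individual likelihood ratios for each trait are purely hypothetical, illustrative numbers, not measurements:

```python
def posterior(prior_cult, likelihood_ratios):
    """Bayes update in odds form: multiply the prior odds of 'cult'
    by each likelihood ratio P(trait | cult) / P(trait | non-cult).
    Multiplying the ratios treats the traits as independent given
    non-cult; given cult they share a common cause, so the real
    update is at least this strong."""
    odds = prior_cult / (1.0 - prior_cult)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)  # convert odds back to a probability

# Prior: 0.9 that a doomsday-salvation group is a cult (i.e. the
# 0.1 benefit-of-the-doubt above that it is not).
# Hypothetical ratios for: no academic credentials, no evidence of
# rounded knowledge, extreme outlier on degree of belief, fits the
# NPD diagnostic criteria.
print(posterior(0.9, [3.0, 3.0, 5.0, 4.0]))  # ~0.9994
```

Even with modest ratios like these, the product drives the posterior to virtual certainty; the exact numbers matter much less than the fact that several unlikely-for-a-non-cult traits co-occur.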