Bayes-users and the Bayesian Conspiracy.
Because “I’m a member of the Bayesian Conspiracy” isn’t going to cause ANY problems with other people, right?
Don’t forget to mention that lots of us are also members of the cult of the frozen decapitated head, and that nearly all of us hope to understand the universe better so we can make something like God, but better.
Yeah, I’m pretty sure my housemates would still flee if I told them I was inviting the Bayesian Conspiracy to dinner.
I suppose you could joke about the Weird Ideas and signal that you don’t take yourself too seriously, which might put them at ease if it convinces them you aren’t implying you’re better than they are, so their status isn’t being threatened… but that might defeat the purpose by doing nothing to reduce the bias against non-conventional ideas. I’m not very good at modeling people.
Since I found this place through reading Harry Potter and the Methods of Rationality, I’ve always thought of it as the Bayesian Conspiracy.
And I’m dying for another chapter!
Hopefully not from old age
I’m already slowly dying from that, but I’m working on it.
I have yet to apply Bayes in any formal manner that I didn’t already use before finding Less Wrong. My mental framework has still shifted significantly, and the NYC community has benefited me even though there has been no central focus on Bayes.
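For concreteness, here’s the kind of formal application I have in mind but rarely actually do, a minimal sketch with made-up numbers (a rare condition, a fairly accurate test):

```python
# Hypothetical numbers for illustration: a condition with a 1% base rate
# and a test with 90% sensitivity and a 5% false-positive rate.
prior = 0.01              # P(condition)
p_pos_given_cond = 0.90   # P(positive | condition)
p_pos_given_not = 0.05    # P(positive | no condition)

# P(positive) by the law of total probability
p_pos = p_pos_given_cond * prior + p_pos_given_not * (1 - prior)

# Bayes' theorem: P(condition | positive)
posterior = p_pos_given_cond * prior / p_pos
print(f"P(condition | positive test) = {posterior:.3f}")  # ~0.154
```

Even with a positive result, the posterior stays low because the base rate dominates — that explicit calculation is what “applying Bayes formally” would look like, and in practice I just absorb the qualitative lesson instead.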
Much as I love the phrase “Bayesian Conspiracy”, I don’t think it’s actually descriptive of what I’ve been participating in.
I thought “Bayesian Conspiracy” was a wonderful name for the LW contingent that did Burning Man, but I’m not sure it works that well in other contexts.
I think of the core ideal of LW as “getting better at thinking, understanding the world, and acting effectively on it.” “Rationalism” and “Rationality” are good summaries of this idea, though they do have some PR problems, as the post says. “Bayes” is just too restrictive: not only does it leave out a lot of things, as Raemon says, it also ties our identity too closely to a particular epistemology, however powerful. If it turned out that a different epistemology were better than Bayesianism, that should not destroy LW.
BUGs for short (Bayesian Users Group)