Something like this also happened with Event Horizon, though the metamorphosis is not yet complete...
Once every few days or so.
Broadly agreed—this is one of the main reasons I consider internal transparency to be so important in building effective organizations. In some cases, secrets must exist—but when they do, their existence should itself be common knowledge unless even that must be secret.
In other words, it is usually best to tell your teammates the true reason for something, and failing that you should ideally be able to tell them that you can’t tell them. Giving fake reasons is poisonous.
In some cases it can be—and I will discuss this further in a later post. However, there are many situations where the problems you’re encountering are cleanly solved by existing paradigms, and looking at things from first principles leads only to reinventing the wheel. For instance, the appropriate paradigm for running a McDonald’s franchise is extremely well understood, and there is little need (or room) for innovation in such a context.
Strategic Thinking: Paradigm Selection
Did social desirability effects mask Trump’s true support?
This is one of the worst comments I’ve seen on LessWrong and I think the fact that this is being upvoted is disgraceful. (Note: this reply refers to a comment that has since been deleted.)
Excellent link.
This post seems better suited for the Discussion section.
So, maybe this is just my view of things, but I think a big part of this conversation is whether you’re on the outside looking in or on the inside looking out.
I’m on the inside and I think we should get rid of these things for the sake of both insiders and outsiders.
Is that true? I mostly don’t notice people scoring cheap points by criticizing religion; I mostly notice them ignoring religion.
See for instance Raising the Sanity Waterline, a post which raises very important points but is so unnecessarily mean-spirited towards religion that I can’t particularly show it to many people. As Eliezer writes elsewhere:
Why would anyone pick such a distracting example to illustrate nonmonotonic reasoning? Probably because the author just couldn’t resist getting in a good, solid dig at those hated Greens.
In terms of weird fixations, there are quite a few strange things that the LW community seems to have as part of its identity—polyamory and cryonics are perhaps the best examples of things that seem to have little to do with rationality but are widely accepted as norms here.
If you think rationality leads you to poly or to cryo, I’m fine with that, but I’m not fine with it becoming such a point of fixation or an element of group identity.
For that matter, I think atheism falls into the same category. Religion is basically politics, and politics is the mind-killer, but people here love to score cheap points by criticizing religion. The fact that things like the “secular solstice” have become part of rationalist community norms and identity is indicative of serious errors IMO.
For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it’s not wrapped up in all this unnecessary weird stuff.
I think LessWrong has a lot of annoying cultural problems and weird fixations, but despite those problems I think there really is something to be gained from having a central place for discussion.
The current “shadow of LessWrong + SSC comments + personal blogs + EA forum + Facebook + IRC (+ Tumblr?)” equilibrium seems in practice to have led to much less mutual knowledge of cool articles/content being written, and perhaps to less cool content being written as well.
I’d really like to see a revitalization of LessWrong (ideally with a less nitpicky culture and a lack of weird fixations) or the establishment of another central hub site, but even failing that I think people going back to LW would probably be good on net.
My impression significantly differs, though I’m far from confident. I’d be interested in seeing an expanded version of this point because it seems potentially very valuable to me.
I agree, that comment was written during a somewhat silly period of my life. :)
This post seems more suited for the Discussion section (insofar as it is suitable for LW at all).
[LINK] Yudkowsky’s Abridged Guide to Intelligent Characters
It went very well—too well, in fact! Writing a LessWrong post did not feel alive to me, so I didn’t do it.
Great post! I’d love to see this in the Main section.
Would you avoid making yourself better at thinking because you might start winning arguments by bamboozling your opponent?
I do avoid making myself better at arguing for this reason. Thinking is another story.
With respect to power dynamics points one and two, there is another person known to the community who is perhaps more qualified and already running something similar in several respects—Geoff Anders of Leverage Research. So I don’t think this is precisely the only group attempting something of this sort, though I still find it novel and interesting.
(disclaimer: I was at the test weekend for this house and am likely to participate)