It seems like we’re all getting distracted from the main point here. It doesn’t even matter whether SBF did it, let alone why. What matters is what this says about the kind of world we’ve been living in for the last 20 years and, now, for the last 7 days:
I strongly suspect[4] that ten years from now, conventional wisdom will hold the above belief as basically canon, regardless of further evidence in either direction. This is because it presents an intrinsically interesting, almost Hollywood-villain-esque narrative, one that will surely evoke endless “hot takes” which journalists, bloggers, etc. will have a hard time passing over. Expect this to become the default understanding of what happened (from outsiders at least), and prepare accordingly.
The fact that LessWrong is vulnerable to this, let alone EA, is deeply disturbing. Smart people are supposed to automatically coordinate around this sort of thing, because that’s what agents do, and that’s not what’s happening right now. This is basically a Quirrell moment in real life; a massive proportion of people on LW are deferring their entire worldview to obvious supervillains.
Who are the obvious supervillains that they’re deferring their entire worldview to? And who’s deferring to them?
This comment had negative karma when I looked at it. I don’t think we as a community should be punishing people for asking honest questions, so I strong-upvoted it.
He’s not saying LessWrong is vulnerable to it; he’s saying that’s just what people outside of LessWrong are going to believe. He’s mentioning it explicitly precisely so that it isn’t necessarily taken at face value.
You are correct that I was not explicitly saying LessWrong is vulnerable to this (except insofar as this assumption hasn’t really been pushed back on until nowish), but to be honest I do expect some percentage of LessWrong folks to end up believing this regardless of evidence. That’s not really a critique of the community as a whole, though, because in any group, no matter how forward-thinking, you’ll find people who don’t adjust much based on evidence contrary to their beliefs.