I agree. When you look up criticism of LessWrong you find plenty of very clear, pointed, and largely correct criticisms.
I used time-travel as my example because I didn’t want to upset people, but really any in-group/out-group forum holding some wild ideas would have sufficed. This isn’t at Flat Earther levels yet, but it’s easy to see the similarities.
There are unspoken things you must not say, or you’ll be pummeled, ignored, or fought. Blatantly obvious, vast holes are routinely ignored. A downvote mechanism pushes those comments down.
Talking about these problems just invites the people who are part of the problem to try to draw you in with the same flawed arguments.
Saying “hey, take three big steps back from the picture and look again” doesn’t get anywhere.
Some of the posts I’ve seen on here read like some sort of weird doom cosplay. A person too scared to criticize Bing ChatGPT? Seriously? That can’t be real. It reminds me, in a way, of the play-along posts I’ve seen in antivaxxer communities.
The idea of “hey, maybe you’re just totally wrong” isn’t super useful for moving anything, but it seems obvious that fan-fiction about nanites and other super-techs that exist only in stories could be banned, and that this would improve things a lot.
But beyond that, I’m not certain this place can be saved or will eventually be useful. Setting up a place proclaiming it’s about rationality is interesting and can be good, but it also implicitly states that those who don’t share your view are irrational and wrong.
As the groupthink develops, any voice not in line is pushed out in all the ways it can be pushed out, and there’s never a make-or-break moment where people stand up and state outright that certain topics or claims are no longer permitted (like nanites killing us all).
The OP may be a canary making a comment, but none of the responses here produced a solution or even a path forward.
I’d suggest one: you can’t write “nanite” until we make nanites. Let’s start with that.
If you link me to 1–3 criticisms which you think are clear, pointed, and largely correct, I’ll at least go give them a skim. I’m curious. You’re under no obligation to do this, but if you do, I’ll appreciate it.
Google “lesswrong criticism” and you’ll find them easily enough.
...I have, and haven’t found anything good. (I’m ignoring the criticisms hosted on LessWrong itself, which presumably don’t count?) That’s why I asked for specific links. Now it sounds like you don’t actually have anything in mind that you think would stand up to minimal scrutiny.
The RationalWiki article does make some good points about LW having to reinvent the wheel sometimes due to ignorance or disparagement of the philosophical literature. As criticisms go, though, this is extremely minor… I’d say similar things about the complaints about Yudkowsky’s views on quantum physics and consciousness.
Do you have a specific criticism? I tried that search, and the first result goes right back to LessWrong itself; you could just link the same article. The second criticism is on the LessWrong subreddit. The third is RationalWiki, where apparently some thought experiment called Roko’s Basilisk got out of hand 13 years ago.
Most of the other criticisms are “it looks like a cult”, which is a perfectly fair take; it arguably is a cult that believes in things that happen to be more true than the beliefs of most humans. Or “a lack of application” for rationality, which was also true pre-machine learning.