I am relatively new to the community, and was excited to join and learn more about actual methods for addressing AI risks, and about how to think scientifically in general.
However, after using the site for a while, I am a bit disappointed. I realized I would probably have to filter out many things here.
Good:
There are good discussions and summaries on alignment that are actually useful. There are people working on these issues, from both NGOs and industry, sharing what research they are doing or what actions they have taken on safety. The same goes for bioweapons and similar topics.
I also like articles that try to identify the best thing to work on at the intersection of passion, skill, and importance.
I like the articles that mention or promote contacting reality.
Bad:
Sometimes the atmosphere feels "edgy," and I see people arguing over relatively small things where I don't see how the conclusion would lead to actual action. Maybe this is just the culture here, but I found it surprising how readily people call each other "wrong" when, many times, both sides seemed to be offering opinions. I also see fewer qualifiers like "I think" or "in my opinion" than I would in a typical workplace; people come across as very confident and certain of their own beliefs when communicating. My understanding is that people may be practicing "strong opinions, weakly held," assuming they can state something strongly and change their mind easily. I have found that works better in verbal communication among colleagues, schoolmates, or friends, where one can talk with a relatively small group every day. But on a platform with many more people, where tracking changes of opinion is hard, it might be more productive to soften the "strong" part and qualify claims more in the first place.
I do think the upvote/downvote system, which affects how much you can comment or contribute to the site (and whether you can block a person), encourages groupthink: you need to align with the sentiment of the majority. (I think another answer mentioned groupthink as well.)
Many articles and comments feel quite personal rather than professional (the communication style differs from what I encounter at work), which makes the community a bit confusing, mixing personal and professional opinions and sharing. It would be nice to have a professional section and a separate personal section, each with its own communication norms for efficiency; that would also naturally let people filter out some articles when they want to focus on different things at certain times. It could be good to organize articles better by section as well, beyond the current "tags."
This is a personal belief, but I am biased toward action and hope to see more discussion of how to execute things, or at least of how actions should change given a proposed change in belief.
This may be something more fundamental, a matter of personal belief versus the beliefs of some (but not all) on LessWrong: to a certain extent I appreciate prioritization, but when it becomes too extreme, I feel it is 1) counterproductive for solving the issue itself, and 2) so extreme that it discourages newcomers who also want to work on shared issues. It also feels more fear-driven than rationality-driven, which is discrediting in my opinion.
On 1: many areas of work are interrelated, and focusing on only one may not actually achieve the goal.
On 2: it feels alarming when I see people say "do not let other issues we need to solve get in the way of {AI risks/some particular thing}."
My sense (though I am still fairly new here, so I may not have dug deeply enough through the articles) is that the community may lack connections to, and backgrounds in, the social sciences and how the world actually works, even when discussing society-related topics. (I forget which specific articles gave me this feeling; perhaps ones related to AI governance.)
For now, I will probably continue using the site, but with many, many filters.