Well, yeah, but it also gives a lot of answers and provides an argument for everything. Also, it feels like reading from the community reduces everything to either effective altruism or AI. It may not be true, but I’ve internalized it so much that now I can’t listen to, say, any politician’s statement on the state budget without EA or AI funding immediately coming to mind. Or even any economics or politics or important decision making without feeling like “this is all wrong and we shouldn’t care about it”. It’s a little disheartening :)
Or even any economics or politics or important decision making without feeling like “this is all wrong and we shouldn’t care about it”.
Most people aren’t happy when they read political news. If you care about happiness, then spending less time reading political news might be good regardless of whether you are in this community or not.
Secondly, that reaction isn’t the result of LW ideology. Eliezer wrote http://lesswrong.com/lw/gz/policy_debates_should_not_appear_onesided/, and assigning credences to beliefs instead of categorising them as right/wrong is a pillar of LW.
In general, “this is all wrong” might not be the best trigger from a rationality perspective. Asking “What’s the likelihood that this is right?” is better. Thinking about whether that probability is 20% or 30% can be much more intellectually interesting than thinking in right/wrong terms.