So I think the actual terminal goal for something like LW might be “uncover important intellectual truths.” It’s certainly not “say true things” or the site would be served by simply republishing the thesaurus over and over.
I think if you’re judging the impact on that value, then both “freedom of speech” and “not driving people away” begin to trade off against each other in important ways.
Yes, that I agree with, and I’m happy with that framing of it.
I suppose the actual terminal goal is a thing that ought to be clarified and agreed upon. The about page has:
To that end, LessWrong is a place to 1) develop and train rationality, and 2) apply one’s rationality to real-world problems.
But that’s pretty brief, doesn’t explicitly mention truth, and doesn’t distinguish between “uncover important intellectual truths” and “cause all its members to have maximally accurate maps” or something.
Elsewhere, I’ve talked at length about the goal of intellectual progress for LessWrong. That’s also unclear about what specific tradeoffs are implied when pursuing truths.
Important questions; the community should probably discuss them more. (I thought my posting a draft of the new about page would spark this discussion, but it didn’t.)