I’m not sure that’s a help for biased voting patterns (which would probably come from the views being expressed), but it might help prevent local mind-killing from spilling over onto the rest of the site.
But I don’t think there’s an easy mechanism for that, and comments will still show up in ‘recent comments’ under discussion.
If your forum has a lot of smart people, and they read the recommended readings, then the more people who participate in the forum, the smarter the forum will be. If the forum doesn’t have a lot of stupid, belligerent rules that make participation difficult, then it will attract people who like to post. If those people aren’t discouraged from posting, but are discouraged from posting stupid things, your forum will trend toward intelligence (law of large numbers, emergence with many brains) and away from being an echo chamber (law of small numbers, emergence with few brains).
I wouldn’t stay up late at night worrying about how to get people to up-vote or down-vote things. People won’t listen anyway, but even so, their votes might carry a significant amount of the wisdom found in “the Sequences,” and wisdom from other places, too. They might even carry wisdom from the personal experiences of people on the blue and green teams, who can then contribute to the experiential wisdom of the LessWrong crowd even without being philosophically aware participants, and even while their comments are disdained and down-voted.
If your forum has a lot of smart people, and they read the recommended readings, then the more people who participate in the forum, the smarter the forum will be.
If the forum can be said to have an intelligence which is equal to the sum of its parts, or even just some additive function of its parts, then yes. But this is not reliably the case; agents within a group can produce antagonistic effects on each other’s output, leading to the group collectively being “dumber” than its individual members.
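To make the disagreement concrete, here is a minimal sketch (Python, with made-up numbers, not anyone’s actual model of a forum) of the statistics being invoked: averaging many independent estimates does converge on the truth, but a bias shared across members keeps the group wrong no matter how large it grows.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 10.0          # the quantity the group is trying to estimate
n_members = 100       # hypothetical forum size

# Independent errors: each member's estimate is truth plus private noise.
independent = truth + rng.normal(0, 5, size=n_members)

# Correlated errors: everyone shares a common bias (e.g. a shared talking
# point) on top of smaller private noise.
shared_bias = rng.normal(0, 5)
correlated = truth + shared_bias + rng.normal(0, 2, size=n_members)

print("independent group error:", abs(independent.mean() - truth))
print("correlated group error: ", abs(correlated.mean() - truth))
# With independent noise the group mean approaches the truth as n grows
# (law of large numbers); with a shared bias, adding members doesn't help.
```

The “more brains” argument only goes through in the first case; correlated errors are precisely the echo-chamber failure mode.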
If those people aren’t discouraged from posting, but are discouraged from posting stupid things, your forum will trend toward intelligence (law of large numbers, emergence with many brains) and away from being an echo chamber (law of small numbers, emergence with few brains).
This is true in much the same sense that you can effectively govern a country by encouraging the populace to contribute to social institutions and discouraging antisocial behavior. It may hold in a theoretical sense, but it is too vague to be meaningful as a prescription, let alone a useful one, and a system that implements those goals perfectly may not even be possible.