Here’s another interesting and potentially useful tool I found recently. I’m not sure if this qualifies as a debate tool, but it seems like it’s in the general category of what we’re looking for:
Summary: Canonizer.com is a wiki system with added camp and survey capabilities. The system provides a rigorous way to measure scientific / moral expert consensus. It is designed for the collaborative development of concise descriptions of competing scientific or moral theories, along with the best arguments for each. People can join the camp representing a given theory, which yields a quantitative survey or measure of its consensus relative to all the others. Proposed changes to supported camps go into a review mode for one week, during which any supporter of the camp can object. If the change survives the week with no objection, it goes live, guaranteeing unanimous agreement to such changes by all current signers of the petition. If anyone does object, the camp can be forked (taking along all supporters of the 'improvement'), or the information can be included in a supporting sub-camp.
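To make the review cycle concrete, here is a minimal sketch of how that one-week review / objection / fork workflow could be modeled. The names, data structures, and weights are my own assumptions for illustration; the summary above doesn't specify how Canonizer actually implements this.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

REVIEW_PERIOD = timedelta(weeks=1)  # proposed changes wait this long before going live


@dataclass
class Camp:
    statement: str
    supporters: set = field(default_factory=set)
    subcamps: list = field(default_factory=list)


@dataclass
class ProposedChange:
    camp: Camp
    new_statement: str
    submitted_at: datetime
    objections: set = field(default_factory=set)


def object_to_change(change: ProposedChange, person: str) -> None:
    """Only current supporters of the camp may object during the review period."""
    if person in change.camp.supporters:
        change.objections.add(person)


def resolve(change: ProposedChange, now: datetime, fork_supporters: set) -> Camp:
    """After the review period: go live if unopposed, otherwise fork.

    Going live only after a week with no objections is what guarantees that
    every current signer has implicitly agreed to the new wording.
    """
    if now - change.submitted_at < REVIEW_PERIOD:
        raise ValueError("still in review")
    if not change.objections:
        change.camp.statement = change.new_statement  # unanimous: change goes live
        return change.camp
    # Someone objected: fork, taking the supporters of the 'improvement' along.
    fork = Camp(statement=change.new_statement, supporters=set(fork_supporters))
    change.camp.subcamps.append(fork)
    change.camp.supporters -= fork_supporters
    return fork
```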
The karma or ‘canonization’ system lets readers select whichever algorithm they wish from the sidebar to ‘find the good stuff’. For example, you can compare the consensus among mind experts with the default general-population consensus. Each camp has a forum for discussing and debating further improvements to it. The general idea is to debate things in the forums, or elsewhere, and to summarize everyone’s current, state-of-the-art view in the camp statements. A history of everything is maintained, providing a dynamic, quantitative measure of how well accepted any theory is as ever more theory-falsifying scientific data and new arguments come in.
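A rough illustration of what choosing different ‘canonization’ algorithms could mean: the same supporters produce different consensus scores depending on how each person is weighted (one person, one point for the general population view, versus extra weight for recognized experts). The weight values and the expert registry below are hypothetical, not Canonizer’s actual formulas.

```python
def camp_score(supporters, weight) -> float:
    """Sum each supporter's weight under the chosen canonization algorithm."""
    return sum(weight(person) for person in supporters)

# Algorithm 1: default 'general population' consensus -- one person, one point.
population_weight = lambda person: 1.0

# Algorithm 2: hypothetical 'mind expert' canonizer -- experts count for more.
MIND_EXPERTS = {"alice", "bob"}  # assumed registry of recognized experts
expert_weight = lambda person: 10.0 if person in MIND_EXPERTS else 1.0

camp_a = {"alice", "bob", "carol"}
camp_b = {"dave", "erin", "frank", "grace"}

for name, camp in [("Camp A", camp_a), ("Camp B", camp_b)]:
    print(name,
          "population:", camp_score(camp, population_weight),
          "expert-weighted:", camp_score(camp, expert_weight))
# Camp B wins on raw head count, but Camp A wins once expert support is weighted up.
```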
A priori, it seems likely that this would lead to Green vs. Blue behavior: “Go mind experts!”