Software tools for community truth-seeking
In reply to: Community Epistemic Practice
There are software tools that may be helpful for community truth-seeking. For example, truthmapping.com is described very well here. There is also debategraph.org, and I’m sure there are others.
The Truthmapping site encourages people to chop their arguments up into lots of little pieces. The problem with that is that if you take an argument A->B->C, and split it into two separate arguments, you’re likely to end up with arguments A->B and B’->C, where B and B’ look identical but turn out to be different on closer inspection. This is enough of a problem when A->B and B->C are in the same article; separating them will only make it worse.
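To make the hazard concrete: in a minimal tree-of-claims data model (a hypothetical sketch, not any site's actual schema), nothing ties two identically worded claims together:

    from dataclasses import dataclass, field

    @dataclass
    class Claim:
        text: str
        supported_by: list["Claim"] = field(default_factory=list)

    # Argument 1: A -> B (premise A supports conclusion B).
    b = Claim("B", supported_by=[Claim("A")])

    # Argument 2: B' -> C, where B' is worded identically to B.
    b_prime = Claim("B")
    c = Claim("C", supported_by=[b_prime])

    assert b.text == b_prime.text  # the two claims read as the same step...
    assert b is not b_prime        # ...but they are separate nodes, free to drift apart

Text equality is all a reader has to go on, and that is exactly the check that fails on closer inspection.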
That’s a great criticism. By letting the viewer focus on each small step, these tools allow arguments to be checked more rigorously, step by step. However, by encouraging this small-scale focus, they leave the arguer open to inconsistencies that can only be seen at a larger scope.
These are fascinating apps, but I just know Wittgenstein is spinning in his grave.
“Because the claims are natural language text, the structure truthmapper enforces is looser than a syllogism; merely a tree of claims and supporting claims.”
For me, this is the rub: The truthmapper format, which combines the structure of syllogisms with the tumult of online communities and the opacity and weakness of language, invites a kind of cargo cult logic, where things are called premises and conclusions but sound like UN General Assembly resolutions:
“Premise 1: We are all born free and equal in dignity and rights! Premise 2: We are not equal! Conclusion: Revolution!”
These ideas are ambitious, and some progeny of these experiments may turn out to be the next Wikipedia, but you’d have a hard time convincing me that discourse was being elevated until all the arguments on truthmapper are presented in first-order symbolic logic and all the content of the assertions is written in Lojban. Until then, clear writing and frank, iterated assertions in natural language are probably preferable.
A syllogism is three lines, each containing a quantifier, a subject, and a possibly negated predicate. It is a really rigid form of argument, and not tree-like at all. You may be thinking of a sorites, which is a bunch of syllogisms put together. Tree structured arguments are incredibly common in all kinds of logic, proof theory, and argumentation theory. Leaping from “tree-shaped” to “sorites” is like leaping from “flattish” to “flat-earthers”.
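For concreteness, the classic valid form (Barbara) looks like this, each line a quantifier, a subject, and a predicate:

    \begin{array}{l}
    \text{All } M \text{ are } P. \\
    \text{All } S \text{ are } M. \\
    \therefore\ \text{All } S \text{ are } P.
    \end{array}

The middle term M must appear in both premises and vanish from the conclusion; there is no branching anywhere, which is why the form is rigid rather than tree-like.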
Regardless of my nitpicking, I agree with you: we need progeny of these experiments. I may disagree about the details (predicate logic? lojban?!).
Thank you for the information on syllogisms. I know I was using the term incorrectly above, and I really should have known better. It may be nitpicking, but I think rationalists, more than others, are probably interested in making sure they use words correctly.
If you’re familiar with Lojban, I’d be very interested in a post on how you think it would or wouldn’t help with rationality.
TruthMapper scares me, for the same reason Objectivists I used to know who thought they knew a formal deductive proof going from “A is A” to “Taxation is slavery”, justifying each step with an inference rule scared me.
See for example the proof that commercialization of fine art hurts society.
I’m not sure whether TruthMapper encourages people to be sloppy, or whether it’s such a good tool that the sloppiness is just much more obvious than it would be on a message board.
But I’m inclined to lay a bit of the blame on the site itself. For one thing, the video claims that it lets people make all assumptions explicit, which I take to mean that the company behind it believes that. For another, the entire philosophy seems to be that argument should work like an Aristotelian syllogism, and that’s part of the problem. For a third, I can’t take them seriously with that logo. Did they pay the designer per Photoshop layer effect used?
Debategraph looks like a mind map kind of thing. I suppose if that’s the way you like seeing your information organized, it could be useful. I’m just wary of the whole concept of formalizing debate too much (by formal, I mean formal as in official, not formal as in formal systems). Once you start thinking like a high school kid at Debate Club, you’ve already lost, and I worry these sites could encourage that mode.
The idea of truth-seeking software is a good one, but there’s got to be a way to avoid aiming it at the lowest common denominator.
I used to be quite interested in that kind of technology; I had even set up a few experiments on a wiki, though they never went very far. I used to argue that such tools could be a good way of creating information on divisive issues, as an alternative to having both sides set up their own resources and avoid linking to good arguments from the other side.
I guess now I’ve lost interest in them, and don’t think they’re that useful. Someday I’ll have to go back and try all the “high-tech debate” sites that have sprung up, but I’m more skeptical about the benefit of those kinds of “debate technology”. (One red flag: I don’t feel very inclined to participate in them myself, at least much less than I would in forums or blog comments.)
I think having publicly edited “chains of reasoning” could be interesting, because they could help show someone where others might disagree with his logic. For example, if the objectivists you mention had their formal proofs laid out for public criticism, they’d probably be forced to admit that those proofs aren’t as strong as they thought.
In other words, I don’t think “pyramids of logic” have much value, but these sites might help point out the weaknesses of pyramids of logic to those who rely on them too much (Blaise Pascal, I’m looking at you).
If I am not mistaken, you have several criticisms of truthmapper. I’ve tried to respond to them in a carefully numbered fashion. This separation might be a rough approximation of the way a software tool would structure an argument.
1. A proof from the sole premise ‘A is A’ concluding ‘Taxation is slavery’ is certainly fallacious, I agree. Can you expand on what the ‘same reason’ is? I’m not sure what I’m expected to see in the argument you reference. It is awkward at the very least, but it is more detailed, concrete, and falsifiable than many trollish claims, and some of its flaws are pointed out in the critiques.
2. The site may encourage people to be sloppy in their argumentation, or it may make sloppiness more obvious.
3. The video makes a fallacious claim (“all assumptions explicit”), and that diminishes my trust in the organization, I agree.
4. I’m not sure what you mean by “argument should work like an Aristotelian syllogism”. There are many flaws in syllogisms; the one I remember is the inability to prove that a horse’s head is an animal’s head (see the sketch after this list). Because the claims are natural language text, the structure truthmapper enforces is looser than a syllogism; merely a tree of claims and supporting claims.
5. You’re entirely correct: the logo is not good.
6. Paying official attention to argumentation may encourage making it a status contest, with individuals striving to “win” rather than striving to discover the truth. This is a thorny problem for rationality, but I don’t think it is confined to argumentation software.
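The horse’s-head example, usually credited to De Morgan, is worth spelling out. Roughly, in first-order notation, the inference is:

    \forall x\,\big(\mathrm{Horse}(x) \rightarrow \mathrm{Animal}(x)\big) \;\vdash\; \forall y\,\Big(\exists x\,\big(\mathrm{Horse}(x) \wedge \mathrm{HeadOf}(y,x)\big) \rightarrow \exists x\,\big(\mathrm{Animal}(x) \wedge \mathrm{HeadOf}(y,x)\big)\Big)

This is trivially valid in first-order logic, but a syllogism cannot even state it, because the three-line form has no way to express a relation like “is the head of”.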
I think we only disagree on 4. (You agree with me on 2,3, and 5, and I agree with you that 6 is not confined to software). I think the expansion of 1 you want really is 4, and I admit I explained 4 poorly. It is kind of tangled in my own head, but maybe I can do better:
TruthMapper encourages people to think that an argument about politics or religion or culture is structured like a deductive proof, where if there’s a problem, it’s because someone accidentally used (A → B) and (B) to conclude (A), or something similarly silly. The real problem with all of these arguments is different: no one has grounded their morality properly, people treat generalizations as universals, import hidden assumptions, think proving a single major negative of a disliked theory is enough without running a cost-benefit analysis, use words wrongly, and so on.
But upon further thought, you’re right that this is the program making a common flaw more obvious, not the program creating the flaw. Still, if I were to encounter, for example, the argument about art on a message board, I would try to explain why the whole argument was hopeless because of these points, and how the person’s style of argument could become more rigorous. Whereas on TruthMapper, I am reduced to sniping at why Point 4 doesn’t follow from Point 3.
But I’m open to testing the system empirically. I trust the people here to avoid the sort of mistakes the participants in the art argument made. If you want to organize a LessWrong debate about something on TruthMapper, I’ll participate and change my mind if the debate goes better than it would in a comment thread here.
Awesome; I think we may have actually communicated.
Despite my posting these things, I don’t really want to organize a LessWrong debate on TruthMapper or DebateGraph. Their user interfaces are both so clumsy and annoying that I’d rather wait (or work) for something more pleasant to use.
Good analysis, Yvain. I guess TruthMapper could be handy when some chain of reasoning pushes the limits of your working memory, yet the emphasis on debate and persuasiveness grates almost as much as the ghastly web design.
The first thing I noticed was “What is not fully understood is not possessed.”—Goethe. I’ve got a car, a book on string theory and a spleen that I am quite confident I possess. Annoying.
Would you prefer Feynman’s formulation?
We used Rationale in the philosophy classes at school. It’s an argument mapping tool. Too bad it’s Windows only :-(. I would really like to see a platform independent or online version.
http://rationale.austhink.com/
This comment regarding this groupware may be of interest in the context of this thread.