The result is typically that LW can recite plenty of rationalist principles, but is clueless when it comes to applying them to a significant number of real-world problems.
They got the SIAI funded.
Person1: X is a true fact about the world
Person2: But saying X is mean
The genome of the Ebola virus is a true fact about organisms, yet it would be dumb to post it on a microbiology forum. Besides, “if you don’t agree you are dumb” is a statement that has to be backed by an exceptional amount of evidence. If the people who already disagree are not intelligent enough to grasp the arguments, they can only be convinced by evidence.
“In the field of security engineering, a persistent flat-earth belief is ‘security by obscurity’: the doctrine that security measures should not be disclosed or even discussed.
In the seventeenth century, when Bishop Wilkins wrote the first English-language book on cryptography (1641), he felt the need to justify himself: “If all those useful Inventions that are liable to abuse, should therefore be concealed, there is not any Art or Science which might be lawfully profest”. In the nineteenth century, locksmiths objected to the publication of books on their craft; although villains already knew which locks were easy to pick, the locksmiths’ customers mostly didn’t. In the 1970s, the NSA tried to block academic research in cryptography; in the 1990s, big software firms tried to claim that proprietary software is more secure than its open-source competitors.
Yet we actually have some hard science on this. In the standard reliability growth model, it is a theorem that opening up a system helps attackers and defenders equally; there’s an empirical question whether the assumptions of this model apply to a given system, and if they don’t then there’s a further empirical question of whether open or closed is better.
Indeed, in systems software the evidence supports the view that open is better. Yet the security-industrial complex continues to use the obscurity argument to prevent scrutiny of the systems it sells. Governments are even worse: many of them would still prefer that risk management be a matter of doctrine rather than of science.”
Let me clarify my last comment. It is really all about what we want. We have to accept that Less Wrong is not only about refining rationality. Less Wrong also won’t be able to refine rationality if it allows some topics to be discussed in great detail, because they put the future of this platform at risk. So every statement here has to be taken with a grain of salt and understood in a larger context. Proclaiming the truth might be rational if you value truth in and of itself. But since rationality is about winning, you have to ask what constitutes winning. The answer to that question is ultimately ideological and a matter of taste.
Again, you are being logically rude. I refuted (I think) the idea that “‘if you don’t agree you are dumb’ is a statement that has to be backed by exceptional amounts of evidence.” Don’t move the goalposts mid-debate. Admit that there are, in fact, some statements such that if you disagree with them, you are dumb, no massive dossier of evidence required.
So what is it that you are trying to argue and that I am evading? I don’t think you can generalize from the example of avoiding signaling LW’s intellectual superiority to the general issue of political correctness. Some factual statements are simply bad arguments to use in a debate.
I’m not being logically rude; I’m trying to argue that political correctness and epistemological soundness are not necessarily mutually exclusive. Further, if you want to output a plan for action, you had better tweak it for real-world use, which naturally must include some signaling. Only afterwards can one tackle the more fundamental question of whether political correctness is rational in general, e.g. as a matter of overcoming human nature.
I refuted (I think) the idea that “‘if you don’t agree you are dumb’ is a statement that has to be backed by exceptional amounts of evidence.”
I do not think that you have refuted it. I also took part of your argument to be the assertion that we sometimes shouldn’t keep quiet about the truth, whatever the consequences. I do not agree with that either.
Telling people they are dumb means that you are sufficiently sure that (1) you are right, (2) they are wrong and not just more demanding (of more evidence, different kinds of evidence, etc.), and (3) the reason they disagree is that they are intellectually inferior. Further, even if you are sure someone is dumb, calling them dumb is still a really bad argument, because it is not persuasive. If someone is dumb, you have to be even smarter to convince that person. And if you just proclaim that someone is dumb, maybe you are not as smart as you thought either.
Some people don’t know that they are alive. Does that mean they are dumb? Eliezer Yudkowsky might be able to make sense of such a disorder, given all his background knowledge. But would he be able to do so if he had grown up without the chance to acquire his current set of skills? A lot of one’s potential intelligence is only unleashed under certain environmental circumstances, e.g. an advanced education. There are indeed people who possess less potential. Yet if we want to make them aware of their shortcomings, it is not rational to tell them they are dumb; it is better to tell them to try to estimate their intelligence objectively. There are more effective ways to communicate the truth than proclaiming the conclusion.
2+2=4 if you don’t agree you are dumb.
My calculator agrees that 2+2=4; so what? If someone challenges your beliefs, it does not mean that the person is dumb, but perhaps that you accepted as given something less obvious than you think. The complete formal proof of 2 + 2 = 4 involves 2,452 subtheorems in a total of 25,933 steps.
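To illustrate how much machinery hides behind even 2 + 2 = 4, here is a toy Peano-style sketch in Python: the naturals are built from zero and a successor function, and addition is defined by recursion. This is only an illustration of the idea, not the formal proof the figures above refer to; all names here (`ZERO`, `succ`, `add`) are invented for this sketch.

```python
# Toy Peano-style naturals: zero is the empty tuple, and each successor
# wraps the previous number in one more layer of nesting.
ZERO = ()

def succ(n):
    """Successor: wrap n in one more layer."""
    return (n,)

def add(a, b):
    """Addition by recursion on the second argument:
    a + 0 = a;  a + succ(b') = succ(a + b')."""
    if b == ZERO:
        return a
    return succ(add(a, b[0]))

def to_int(n):
    """Count the nesting layers, for display."""
    return 0 if n == ZERO else 1 + to_int(n[0])

two = succ(succ(ZERO))
four = succ(succ(succ(succ(ZERO))))

# "2 + 2 = 4" now follows by unfolding the recursive definition of add.
assert add(two, two) == four
print(to_int(add(two, two)))  # 4
```

Even this toy version shows that the statement rests on definitions and a recursion principle; a fully formal proof from axioms has to justify every one of those steps explicitly.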
Yes, but if we are talking about real-world problems, then we have to deal with people who are dumb, and sometimes we have to convince them in order to get what we want. It can be rational to limit the truth output of a forum of truth-seekers. An analogy is the intolerance of intolerance: to maximize tolerance, you have to be intolerant of intolerance. The same holds for rationality: you won’t make the world a more rational place by telling irrational folks the truth, namely that they are irrational; that would just result in more irrational behavior.
There are cases where data or ideas can be really hazardous. I don’t count “but it might hurt somebody’s precious feelings” as one of those cases.
I just came across this:
This seems to be neither here nor there as regards the present debate.
I assign some probability to security by obscurity working for bio, and some to it not working.
You are being logically rude. Please don’t!