Because the resolution in both cases is the same: a critical thinker.
A critically-thinking skeptic can deduce the truth in both cases, but that doesn’t make the cases anywhere close to equivalent. Accidental falsehoods shouldn’t engender nearly the same degree of distrust that deliberate falsehoods should, and teaching anybody (a child being absolutely no exception) to not trust anybody is impractical for both you and the student. There are degrees of trust. Learning to recognize lies is important for a different reason than learning to recognize mistakes is important. You aren’t always going to be able to determine the correct answer by critical thinking alone; personal reputation and recognition of an agenda also play a role.
I’m not sure what trust has to do with this.
Are you saying that people we trust are always right?
That critical thinking isn’t necessary for kids? That they should just trust we are right?
You say “don’t lie to them about verifiable facts”.
Are verifiable facts the same as truth? No more, no less?
Can all truth be stated in a logically sound manner that is backed up by verifiable facts?
Doesn’t that sound more like empiricism than rationalism?
I never even came close to stating that “the people we trust are always right”. You appear to be viewing the word “trust” entirely too much as a binary state. Tabooing that word for now...
Since we can’t always spend the time and effort to verify a claim, it’s important for people (and thus important to teach children) to be able to quickly estimate the probability that a person is correct when they say something. There are a number of factors that can go into such an estimate, and they will differ based on the statement and the speaker. A person who has a known history of trying to deceive others should be assigned a lower prior probability of correctness than a person who has not shown such a history. A person who makes a large number of honest mistakes should also be assigned a lower probability than somebody who doesn’t. However, that is where the similarities end.
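One way to make the track-record idea above concrete: a quick sketch of turning a speaker’s verified history into a prior that their next claim is correct. This is my own illustration, not anything from the original comment; the `prob_correct` function, its smoothing scheme, and all the numbers are hypothetical.

```python
# Hedged sketch: estimate a speaker's reliability from past verified claims,
# blended with a base rate so an unknown speaker gets no penalty or bonus.
# Function name, smoothing scheme, and numbers are illustrative assumptions.

def prob_correct(true_statements, false_statements, base_rate=0.5):
    """Smoothed estimate that this speaker's next claim is correct.

    true_statements / false_statements: past claims we later verified.
    base_rate: our prior for an arbitrary claim of this kind; it acts as
    two "pseudo-observations" so small samples don't swing the estimate.
    """
    return (true_statements + 2 * base_rate) / (true_statements + false_statements + 2)

# A speaker with no track record defaults to the base rate:
# prob_correct(0, 0) == 0.5
# A speaker caught in many false claims drops well below it:
# prob_correct(2, 8) == 0.25
```

As the comment argues, this only gives a starting prior; the content of the specific claim (peer review, common sense, the speaker’s expertise in that particular field) should still move the estimate up or down.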
An ardent young-earth creationist’s views on geology should probably be assigned a very low prior probability of correctness, but if that same Y.E.C. has a PhD in economics, their views on something like inflation should probably be given a higher probability of correctness than those of most people. When you know the speaker has a tendency towards non-malicious incorrectness in a given area, you can use that information to discount their beliefs in that area without writing off everything the person says in all areas. You should still be skeptical of anything they say that seems unlikely, and you should expend the effort on verifying the claim that is appropriate given the value of your time and effort and the probability that they are correct (taking into account things like how well peer-reviewed the position is, whether it contradicts common sense, etc., but also considering how well the person can be expected to know the field and whether they have any known reason to deceive people about it). All else being equal, there’s no reason I know of to have a greater expectation that a Y.E.C. is incorrect about inflation than somebody who is not a Y.E.C.
For a person who has a known history of intentional deceit, it makes sense to use a lower prior probability of correctness for everything they say. A politician’s promises are an excellent example; without going into any actual political side, I think we can all agree that politicians are far more likely to make false promises and deceptive claims than the average person who is not in (or striving for) a similar position of popular authority. There is reason to assign a lower prior probability of correctness to almost anything a politician says (publicly) than there is for an otherwise-equal non-politician.
Now step back from the broad categories of things like Y.E.C.s and politicians, and consider the people around you in your daily life. Most of them will have no motive to intentionally deceive you, but some will. Many of them will have biases towards incorrect positions in a lot of areas, but it would be inefficient (and socially awkward) to treat a friend who has a known bias about a sports team as though they’re a pathological liar about non-sports things just because they’re completely blind to that team’s quality and you’ve caught them in a number of false claims about the team that they should have known were false. On the other hand, some people just are unreliable about things, or think it’s funny to convince people of lies for no purpose but their own amusement, or have developed a reason to want to hurt you personally and will say whatever they think will have that effect. It is important to be able to tell the difference between those people and those who merely sometimes make honest mistakes.
Well, I guess that all depends.
Do you care about the truth, or is it convenient to accept the established authoritative answer?
Here’s an example:
You realize that “inflation” creates most of the large-scale structure of the universe in a fraction of a nanosecond?
Let’s get this perfectly straight.
The Earth created in 7 days = BAD
The entire Universe created in a fraction of a nanosecond by unlimited amounts of Dark Energy = GOOD
That’s honestly what you’re selling here?
http://www.nature.com/news/big-bang-blunder-bursts-the-multiverse-bubble-1.15346
(I realize you meant inflation in the economic sense, I just thought the connection was too good.)
Given the presence of the word “economics” in the sentence in question, I can’t help thinking you have misunderstood what CBHacking meant by “inflation”.
Oh, wait. So … you knew what you were saying was nonsense, but the opportunity was just too great because … you think it’s obvious that a leading scientific theory is less credible than young-earth creationism and think it’s important to pretend to laugh at someone saying otherwise?
Blimey.
Er, you do realise (don’t you?) that all that’s saying is that one particular experimental result that some people said was probably evidence for inflation turned out to be ambiguous? And that this leaves the credibility of inflation no worse than before the experiment in question was done?
I wrote the part about YEC and inflation before realizing he meant inflation in another sense.
But I think that just draws attention to the question.
How much time do we spend coming up with excuses not to critically investigate claims, and at what point do we critically investigate the claims?
I consider myself a critical rationalist, à la Karl Popper.
What type of rationalists aren’t critical? Non-critical rationalists? Selectively critical rationalists?
Why select when you are going to think critically and when you are not? Why not think critically all the time?
That would be a question of “value of information”, which actually I think is a somewhat neglected topic in LW’s collective writings on rationality.
But I get the impression that you’re asking this not as an interesting general question, but because you think some category of claim isn’t being critically investigated as it should be, and that people are coming up with excuses instead of doing so. If so, would you like to say briefly and clearly what claims you think those are and what your reasons are for thinking they’re being avoided?
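The “value of information” framing mentioned above can be sketched numerically: verify a claim when the expected cost of acting on a false belief exceeds the cost of checking. This is my own illustration of the standard idea, not something from the thread; the function name and all the numbers are hypothetical.

```python
# Hedged sketch of value of information: is verifying a claim worth it?
# All quantities are in the same arbitrary "cost" units; numbers are made up.

def value_of_checking(p_correct, loss_if_wrong, cost_to_verify):
    """Expected net value of verifying a claim before acting on it.

    p_correct: our prior that the claim is true.
    loss_if_wrong: what acting on a false claim would cost us.
    cost_to_verify: the time/effort needed to check the claim ourselves.
    """
    expected_loss_unchecked = (1 - p_correct) * loss_if_wrong
    return expected_loss_unchecked - cost_to_verify

# Checking a dubious, high-stakes claim cheaply is worth it:
# value_of_checking(0.6, 100, 5) == 35.0 (positive)
# Checking a near-certain, low-stakes claim is not:
# value_of_checking(0.99, 10, 5) < 0
```

With limited resources, this is the selection rule the later comment gestures at: you don’t refuse to think critically, you just spend verification effort where the stakes and the doubt are both large enough to pay for it.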
Dunno. You say that as if there are people declaring proudly that they are non-critical rationalists, but I don’t see that. Again, could you be more explicit?
Limited resources.
If I drew attention to several claims, and in the rare and unprecedented circumstance you actually agreed with me, then all that would do is prove those several claims needed more critical examination.
My point is we shouldn’t be afraid to critically examine everything.
And we shouldn’t be afraid of developing a child’s critical thinking skills, and we don’t need to teach them to trust our knowledge. (If our knowledge is any good, they’ll trust it for their own reasons.)
(If you really want an example, that idea that inflation creates the entire cosmic web in a trillionth of a picosecond is a good start.)
Who is saying otherwise? (This seems rather like a rhetorical technique Tooby and Cosmides accuse Stephen Jay Gould of using: “But I tell you the sun really does rise in the east”.)
Who, please, is saying that we should be afraid of developing a child’s critical thinking skills, and in what context?
I’ve no problem with examining that critically, but I think this is an exercise best done by professional theoretical physicists, whose current position appears to me to be that it’s probably right. (Though not that it’s necessarily well described by the words you happened to use.) If you disagree with that, would you like to say what you consider stronger evidence against taking inflation seriously than the rough consensus of theoretical physicists is for taking it seriously?
(For clarity: I am not claiming, nor do I believe, that there is anything like unanimity among theoretical physicists that inflation is correct. Neither do I claim it’s definitely correct. The usual position appears to me to be that it gives a description of the early universe that fits a lot of otherwise puzzling observations, but that in the absence of more direct evidence than we seem likely to get any time soon we can’t upgrade it much beyond “plausible and a reasonable working hypothesis”. Is that what you’re objecting to, or are you objecting to some much stronger claim of certainty and if so who’s making that claim?)
Do you agree that 2+2=4? So do I. Under that logic, that claim needs more critical examination.
This is the second time you’ve criticized inflation. What is your objection to inflation other than that it doesn’t fit your intuition? Human intuition works very well on the medium scale, not so much on the very small or very large scales.