This is like claiming that because a coin came up heads twenty times and tails ten times it is twice as likely to come up heads this time.
If you don’t assume that the coin is fair, then certainly a coin coming up heads twenty times and tails ten times is evidence in favor of it being more likely to come up heads next time, because it’s evidence that it’s weighted so that it favours heads.
Similarly if a person is weighted so that they favor truth, their claims are evidence in favour of that truth.
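The coin argument above can be made quantitative. A minimal sketch, assuming a uniform Beta(1, 1) prior over the coin's bias (the prior choice and the use of Laplace's rule of succession are my assumptions, not stated in the comment):

```python
# Posterior predictive probability of heads after observing
# 20 heads and 10 tails, under a uniform Beta(1, 1) prior over
# the coin's bias (Laplace's rule of succession).
heads, tails = 20, 10
alpha, beta = 1, 1  # assumed uniform prior; not specified in the comment

# Beta-Binomial posterior predictive: (heads + alpha) / (n + alpha + beta)
p_next_heads = (heads + alpha) / (heads + tails + alpha + beta)
print(p_next_heads)  # 21/32 = 0.65625
```

So even without assuming the coin is fair, the observed record shifts the predictive probability of heads from 0.5 to about 0.66, which is the sense in which the record is "evidence that it's weighted".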
Absent some other reason to justify the correlation between your friend’s accuracy and the current instance, such beliefs are invalid.
Beliefs like trusting the trustworthy and not trusting the untrustworthy, whether you consider them “valid” beliefs or not, are likely to lead one to make correct predictions about the state of the world. So such beliefs are valid in the only way that matters for epistemic and instrumental rationality both.
If you don’t assume that the coin is fair, then certainly a coin coming up heads twenty times and tails ten times is evidence in favor of it being more likely to come up heads next time, because it’s evidence that it’s weighted so that it favours heads.
Or you could make a direct observation (such as by weighing it with a precision instrument, or placing it on a balance) and know.
Similarly if a person is weighted so that they favor truth, their claims are evidence in favour of that truth.
Not unless they have the ability to provide their justification for a given instantiation. It would be sufficient for trusting them if you are not concerned with what is true as opposed to what is “likely true”. There’s a difference between these, categorically: one is an affirmation—the other is a belief.
So such beliefs are valid in the only way that matters for epistemic and instrumental rationality both.
Incorrect. And we are now as far as this conversation is going to go. You hold to Bayesian rationality as axiomatically true of rationality. I do not.
Or you could make a direct observation (such as by weighing it with a precision instrument, or placing it on a balance) and know.
And in the absence of the ability to make direct observations? If there are two eye-witness testimonies to a crime, and one of the eye-witnesses is a notorious liar with every incentive to lie, and one of them is famous for his honesty and has no incentive to lie—which way would you have your judgment lean?
SPOCK: “If I let go of a hammer on a planet that has a positive gravity, I need not see it fall to know that it has in fact fallen. [...] Gentlemen, human beings have characteristics just as inanimate objects do. It is impossible for Captain Kirk to act out of panic or malice. It is not his nature.”
I very much like this quote, because it was one of the first times when I saw determinism, in the sense of predictability, being ennobling.
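The eyewitness scenario above can be run through Bayes' theorem in odds form. A minimal sketch, with all the reliability numbers and the even prior being my own illustrative assumptions (the comment specifies none of them):

```python
# Two witnesses with assumed (hypothetical) reliabilities:
# the honest witness reports correctly 95% of the time,
# the notorious liar only 30% of the time.
p_honest_correct = 0.95  # assumed, for illustration
p_liar_correct = 0.30    # assumed, for illustration
prior_odds = 1.0         # assumed even prior odds of guilt

# Honest witness testifies "guilty":
# likelihood ratio = P(says guilty | guilty) / P(says guilty | innocent)
lr_honest = p_honest_correct / (1 - p_honest_correct)  # 19.0

# Liar testifies "innocent":
# likelihood ratio = P(says innocent | guilty) / P(says innocent | innocent)
lr_liar = (1 - p_liar_correct) / p_liar_correct  # ~2.33

posterior_odds = prior_odds * lr_honest * lr_liar
posterior_p = posterior_odds / (1 + posterior_odds)
print(posterior_p)  # ~0.978
```

Under these assumed numbers the judgment leans heavily toward the honest witness's account, even though neither testimony is a direct observation — which is the point of the question.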
If there are two eye-witness testimonies to a crime
I have already stated that witness testimonials are valid for weighting beliefs. In the somewhere-parent topic of authorities, this is the equivalent of referencing the work of an authority on a topic.