Unpack “trustworthy”—does this mean the person isn’t going to tell falsehoods, but may not actually understand how truth works? Or is this more like Omega—has special access to data?
The person doesn’t tell lies and you trust his/her intelligence and access to information.
But otherwise, the person has non-exceptional access to and discernment of truth? So it’s likely that anything truly unusual he believes is wrong. I don’t think Bayes will let me update all that far from “whatever he says is filtered through an engine not optimized for truth.” Anything that he thinks will “radically alter my worldview” is likely an illusion or something I already have some evidence for.
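The Bayes point above can be made concrete with a small sketch. All numbers here are invented for illustration: a claim with a very low prior, asserted by an honest but fallibly-discerning source, ends up with a posterior that is still small.

```python
# Hypothetical numbers: Bayes' rule shows that even a fairly reliable
# source can't move a very small prior very far.

def posterior(prior, p_claim_given_true, p_claim_given_false):
    """P(H | source asserts H), computed via posterior odds."""
    odds = (prior / (1 - prior)) * (p_claim_given_true / p_claim_given_false)
    return odds / (1 + odds)

# A "worldview-altering" claim: assumed prior of 0.1% that it's true.
# The source asserts it 90% of the time when true, 10% when false
# (a likelihood ratio of 9 -- honest, but not an Omega-grade oracle).
p = posterior(0.001, 0.9, 0.1)
print(f"{p:.4f}")  # about 0.0089 -- still well under 1%
```

Under these assumed numbers the update is nearly tenfold, yet the claim remains very improbable, which is the "Bayes won't let me update all that far" intuition in miniature.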
This changes in cases where I think the person DOES have better-than-average access to truth.
Also, the fact that he’s offering to sell me information that will change my worldview itself makes me much less likely to believe what he says.
You are fighting the hypothetical.
A person has true information that will “radically alter your worldview”. Assume you believe him/her. How much would you pay for the information?
Seems more like trying to clarify the hypothetical. There’s a genuine dependency here.
Yeah, I tend to do that. However, this is the first that you’ve asserted that it’s true information, which is an important clarification. I’m willing to pay a significant amount for true information that will let me make a large update (which is how I interpret “radically alter worldview”).
I read it as a person who generally has a good track record and who has built a reputation for being right when he makes these kinds of claims.
Maybe someone who has already done this intervention a few times, who uses the principles of http://lesswrong.com/r/discussion/lw/oe0/predictionbased_medicine_pbm/, and who can tell you with 90% credence that you will afterwards say he radically changed your mind.
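The track-record idea can be sketched as a simple calibration check. The history below is entirely invented: we compare the claimant's stated 90% credence against the fraction of past clients who actually reported a radical update.

```python
# Hypothetical sketch: judging the claimant by track record, in the
# spirit of prediction-based medicine -- compare the stated credence
# to the realized hit rate on past predictions.

def hit_rate(outcomes):
    """Fraction of past predictions (1 = came true, 0 = didn't)."""
    return sum(outcomes) / len(outcomes)

# Invented history: 10 prior clients, 9 of whom reported that their
# worldview was radically altered afterwards.
past = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]
print(hit_rate(past))  # 0.9 -- consistent with the claimed 90% credence
```

If the realized hit rate tracks the stated credence over enough cases, the "assume you believe him" clause of the hypothetical stops being an assumption and becomes an empirically earned prior.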