Imagine that a completely trustworthy person who knows all your beliefs has acquired information that will “radically alter your worldview.” No further details of the information are given. How much would you pay for it? [pollid:1198]
Unpack “trustworthy”—does this mean the person isn’t going to tell falsehoods, but may not actually understand how truth works? Or is this more like Omega—has special access to data?
The person doesn’t tell lies and you trust his/her intelligence and access to information.
But otherwise, the person has non-exceptional access to and discernment of truth? So it’s likely that anything truly unusual he believes is wrong. I don’t think Bayes will let me update all that far from “whatever he says is filtered through an engine not optimized for truth.” Anything that he thinks will “radically alter my worldview” is likely an illusion or something I already have some evidence for.
This changes in cases where I think the person DOES have better-than-average access to truth.
Also, the fact that he’s offering to sell me information that will change my worldview makes me much less likely to believe what he says.
You are fighting the hypothetical.
A person has true information that will “radically alter your worldview”. Assume you believe him/her. How much would you pay for the information?
Seems more like trying to clarify the hypothetical. There’s a genuine dependency here.
Yeah, I tend to do that. However, this is the first time you’ve asserted that it’s true information, which is an important clarification. I’m willing to pay a significant amount for true information that will let me make a large update (which is how I interpret “radically alter worldview”).
I read it as a person who generally has a good track record and who has built a reputation for being right when he makes these kinds of claims.
Maybe someone who has already done this intervention a few times, who uses the principles of http://lesswrong.com/r/discussion/lw/oe0/predictionbased_medicine_pbm/ , and who can tell you that with 90% credence you will afterwards say that he radically changed your mind.
If all the parts of this hold true, then the person knows me well enough to know how important it would be to me and to the world to change my worldview. If they’re not already telling me without payment, I can conclude that it wouldn’t have much practical impact and would be something like “The Earth is a simulation, but we don’t know anything about how it works beyond physics or who made it, and the proof is convincing.” Given that, I would probably pay a small amount out of curiosity but not more.
Sharing the information might have a cost for the other person that leads to it not being shared without payment.
There’s also the element that you take information a lot more seriously when you’ve paid money for it.
If I know that what they are saying is true, I will already radically alter my worldview, by dividing up my probability estimate among the alternate possibilities that I think are most likely to be true.
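The redistribution described in the comment above can be sketched as a toy update: if a trusted source asserts (without details) that your leading hypothesis is wrong, you can already move its probability mass onto the alternatives by renormalizing. The hypothesis names and numbers below are invented purely for illustration.

```python
def update_on_top_hypothesis_wrong(priors):
    """Zero out the most probable hypothesis and renormalize the rest.

    This models learning only *that* your leading view is wrong,
    before hearing what the replacement is.
    """
    top = max(priors, key=priors.get)
    remaining = {h: p for h, p in priors.items() if h != top}
    total = sum(remaining.values())
    return {h: p / total for h, p in remaining.items()}

# Illustrative priors over worldviews (made-up numbers):
priors = {"mainstream view": 0.7, "rival theory": 0.2, "fringe idea": 0.1}
posterior = update_on_top_hypothesis_wrong(priors)
# The 0.3 of mass left is rescaled: rival theory -> 2/3, fringe idea -> 1/3.
```

The point of the sketch is that the update happens at the moment you believe the claim, not when you learn the specifics, which is why the comment says the worldview has "already" radically altered.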