I have heard this from very smart people.
Could you give an example?
Could you give an example where the claim is that 50% predictions are less meaningful than 10% predictions?
How do you know that it is about accuracy?
I don’t really want to point to specific people. I can think of a couple of conversations with smart EAs or Rationalists where this claim was made.
So you probably won’t convince me that these people know what the claim is, but you haven’t even attempted to convince me that you know what the claim is. Do you see that I asked multiple questions?
Do you see how giving very specific answers to this question would be the same as stating people’s names?
Suffice it to say that I understand the difference between impressiveness and calibration, and it didn’t seem like they did before our conversation, even though they are smart.
Well, that’s something, but I don’t see how it’s relevant to this thread.
I mean conversations like the ones I described: they involved a claim very similar to “50% predictions are less meaningful than 10% predictions”, which came from conflating impressiveness and calibration.
It may be that we’re just talking past each other?
Yes, exactly: this post conflates accuracy and calibration. Thus it is a poor antidote to people who make that mistake.
I do think we’re talking past each other now, as I don’t know how this relates to our previous discussion.
At any rate, I don’t think this discussion adds much value to the rest of the post, so I’ll just leave it here.