By “informal” I meant that the belief is not on a prediction market, so you can influence consensus only by talking, without carefully keeping track of transactions. (I disagree that it’s appropriate not to care about results in informal communication, so that’s not a distinction I was making.)
Exploring here, not sure where it’ll go.
What is the value, to whom, of the predictions being correct? The interesting cases are the ones where something performs the function of a prediction market, feeding back some value for correct and surprising predictions. All else is “informal” and mostly about signaling rather than truth.
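To gesture at what “feeding back some value for correct and surprising predictions” could look like mechanically, here’s a minimal sketch in Python, assuming a log scoring rule measured relative to the consensus probability (the function and the numbers are mine, purely for illustration):

```python
import math

def relative_log_score(p_yours: float, p_consensus: float, outcome: bool) -> float:
    """Log-score reward measured against consensus: positive only when your
    probability beats the consensus probability on the realized outcome."""
    yours = p_yours if outcome else 1 - p_yours
    cons = p_consensus if outcome else 1 - p_consensus
    return math.log(yours) - math.log(cons)

# Agreeing with consensus earns nothing, however correct you both turn out to be:
print(relative_log_score(0.6, 0.6, True))              # 0.0
# A correct *and* surprising prediction (0.9 against a consensus of 0.6) is rewarded:
print(round(relative_log_score(0.9, 0.6, True), 2))    # 0.41
# The same bold call, when wrong, is penalized harder:
print(round(relative_log_score(0.9, 0.6, False), 2))   # -1.39
```

Without something playing that role, nothing pays you for being right where it was hard to be right, which is the sense in which the rest is mostly signaling.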
The value of caring about informal reasoning is in training the same skills that apply to knowably important questions, and in seemingly unimportant details adding up in ways you couldn’t plan for. The existence of a credible consensus lets you use a belief without understanding its origin (i.e. without becoming a world-class expert on it), so it doesn’t exercise those skills.
When your own beliefs correctly disagree with consensus in a way that is useful at scale, acting on them eventually shifts the consensus, or else you have a source of infinite value (you could keep extracting value from the same gap forever). So almost any method of deriving significant value from private predictions being better than consensus is also a method of contributing knowledge to consensus.
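To make the “shifts the consensus or else infinite value” step concrete, here’s a minimal sketch assuming a logarithmic market scoring rule as a toy model (my choice, not anything stated above; the liquidity parameter and numbers are arbitrary). The only way a risk-neutral trader extracts expected value from a belief that beats the market price is by trading until the price equals that belief, which is exactly what moving the consensus means, and the profit available for doing so is finite:

```python
import math

B = 100.0  # LMSR liquidity parameter; arbitrary illustration value

def lmsr_cost(q_yes: float, q_no: float) -> float:
    """Cost function of a logarithmic market scoring rule (LMSR)."""
    return B * math.log(math.exp(q_yes / B) + math.exp(q_no / B))

def lmsr_price(q_yes: float, q_no: float) -> float:
    """Instantaneous YES price, i.e. the market's current consensus probability."""
    e_yes, e_no = math.exp(q_yes / B), math.exp(q_no / B)
    return e_yes / (e_yes + e_no)

# Market opens at the consensus probability 0.5; your private belief is 0.8.
q_yes = q_no = 0.0
belief = 0.8

# A risk-neutral trader keeps buying YES while the price is below their belief.
while lmsr_price(q_yes, q_no) < belief:
    q_yes += 0.1  # small fixed steps, purely for illustration

cost = lmsr_cost(q_yes, q_no) - lmsr_cost(0.0, 0.0)
expected_payout = belief * q_yes  # each YES share pays 1 if YES resolves

print(round(lmsr_price(q_yes, q_no), 2))  # ~0.8: extracting the value moved the consensus to your belief
print(round(expected_payout - cost, 1))   # ~19.3: the expected profit is finite, not an infinite money pump
```

If the price somehow never moved, you could repeat the same positive-expectation trade forever, which is the “infinite value” branch.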
(Not sure what you were pointing at, mostly guessing the topic.)
For oneself, caring about reasoning and correct predictions is well worthwhile. And it requires some acknowledgement that your beliefs are private, and that they are separate from your public claims. Forgetting that this applies to others as well as yourself seems a bit strange.
I may be a bit too far on the cynicism scale, but I start with the assumption that informal predictions are both oversimplified to fit the claimant’s model of their audience, and shifted in direction (away from the claimant’s true belief) to have a bigger impact on that audience.
That is, I think most public predictions are of the form “you should have a higher credence in X than you seem to”, but, for greater impact, are stated as “you should believe X”.