Most predictions in daily life aren’t about sports or about which politician gets elected.
Most meaningful predictions that I make in my daily life aren’t of the type you would find on Intrade.
How often do you make a decision in your daily life where it matters which sports team wins? In my life that doesn’t happen. Most of my personal decisions also don’t depend on which politician wins an election.
To get educated, students go to university, where they try to learn the knowledge in textbooks. Students who seek to study sport focus on studying sport statistics. Students who study politics don’t focus on studying which politician won which election.
Most of the knowledge that people can acquire is outside the category of predictions you find on Intrade.
If people want to learn how the world works, reading textbooks is better than reading the news. By the same token, it makes sense to calibrate on textbook knowledge.
Calibrating on actual personal events is also good. That means that you get better at predicting other personal events.
...aren’t textbook level questions either; the first two paragraphs of your reply strike me as irrelevant to my question.
Textbooks are indeed used in education; that doesn’t establish whether what educates most effectively also happens to be what most effectively trains calibration. We have strong reason to doubt that: namely, that many well-educated people are also poorly calibrated.
On the other hand, I’m not aware of strong evidence to the effect that textbook questions are more effective in training calibration than any other type of question (including sports or world events or estimation quizzes, and so on).
[Most predictions in daily life]...aren’t textbook level questions either
That depends. For a student who spends eight hours per day learning for university, many questions boil down to textbook knowledge.
For a scientist who does biology research, it’s also very important to have a firm grasp of the various biology questions that are based on textbook knowledge.
Good rationality training is supposed to make a scientist who studies biology better at biology.
We have strong reason to doubt that: namely, that many well-educated people are also poorly calibrated.
I don’t think that there are many people who are calibrated on their knowledge of textbook questions.
Let me give you an example:
Question: Which enzymes catalyse RNA synthesis?
A) RNA polymerases B) RNA telomerases
The person who answers the question has to say either A or B and state how likely they are to be right.
In most university courses, students aren’t asked how likely they think they are to be right. As a result, the students aren’t well calibrated on being right.
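As a rough illustration (not taken from any actual course), here is a minimal sketch of how a quiz could attach a stated confidence to each answer and then report calibration; the question list and confidence buckets are made up for the example:

```python
# Illustrative only: a tiny command-line quiz that asks for an answer
# plus a stated confidence, then compares stated confidence with how
# often the answers were actually right.

QUESTIONS = [
    ("Which enzymes catalyse RNA synthesis?",
     {"A": "RNA polymerases", "B": "RNA telomerases"},
     "A"),
    # ... more textbook questions would go here
]

def run_quiz(questions):
    results = []  # (stated confidence, was the answer correct?)
    for text, options, correct in questions:
        print(text)
        for key in sorted(options):
            print(f"  {key}) {options[key]}")
        answer = input("Your answer: ").strip().upper()
        confidence = float(input("How likely are you right? (0.5-1.0): "))
        results.append((confidence, answer == correct))
    return results

def calibration_report(results):
    # Bucket answers by stated confidence and print observed accuracy.
    for lo, hi in [(0.5, 0.7), (0.7, 0.9), (0.9, 1.0)]:
        bucket = [ok for conf, ok in results
                  if lo <= conf < hi or (hi == 1.0 and conf == 1.0)]
        if bucket:
            accuracy = sum(bucket) / len(bucket)
            print(f"stated {lo:.0%}-{hi:.0%}: right {accuracy:.0%} "
                  f"of the time ({len(bucket)} answers)")

if __name__ == "__main__":
    calibration_report(run_quiz(QUESTIONS))
```

Over enough questions, the gap between stated confidence and observed accuracy is exactly the feedback that courses normally don’t provide.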
It seems to me this could be a smartphone app. Whenever a person wants to make a prediction about a personal event, they open the app and speak, with a pause between the prediction and how likely they think it is. The app could just store the verbatim text, separating question and answer, and timestamp the recordings in case you want to update your prediction later. If you learn to specify when you think the outcome will occur, it can make a sound to remind you to check off whether it happened; otherwise it could remind you periodically, say at the end of every day. Why couldn’t it have data-analysis tools to let you visualize calibration, or find useful patterns and alert you? Seems a plausible app to me.
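To make that a bit more concrete, here is a rough sketch of the data such an app might store and how its calibration view could be computed; all the names and fields are hypothetical, not a description of any existing app:

```python
# Hypothetical sketch of the app's core record and calibration analysis.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class Prediction:
    statement: str                      # verbatim text of the prediction
    probability: float                  # stated chance it comes true, 0-1
    made_at: datetime = field(default_factory=datetime.now)
    due_at: Optional[datetime] = None   # when to remind the user to resolve it
    outcome: Optional[bool] = None      # None until checked off

def calibration_view(predictions: List[Prediction],
                     edges=(0.0, 0.2, 0.4, 0.6, 0.8, 1.0)
                     ) -> List[Tuple[float, float, float, int]]:
    """For each probability bucket, return (bucket start, mean stated
    probability, observed frequency, number of resolved predictions)."""
    resolved = [p for p in predictions if p.outcome is not None]
    rows = []
    for lo, hi in zip(edges, edges[1:]):
        bucket = [p for p in resolved
                  if lo <= p.probability < hi
                  or (hi == 1.0 and p.probability == 1.0)]
        if bucket:
            stated = sum(p.probability for p in bucket) / len(bucket)
            observed = sum(p.outcome for p in bucket) / len(bucket)
            rows.append((lo, stated, observed, len(bucket)))
    return rows
```

The speech-to-text front end, the reminder sounds, and any pattern-finding alerts would sit on top of this; the calibration chart is just these rows plotted as stated probability against observed frequency.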
What makes you think so?