I like your example, but there is additional evidence that could be gathered to refine your premise. You can check the traffic situation along your route and make estimates of travel time. So, given additional tools, there is a chance of making “everything is fine” the more likely scenario rather than not. I think this is especially true for those of us who drive cars. If you and I decide to go to the Denver Art Museum, and you are coming from a hotel in downtown Denver while I’m driving from my house out of town, whether I’m going to be on time or not depends on all the factors you mentioned. However, I can mitigate some of those factors by adding data. I can do the same for you by giving you a map, or by pointing you toward a tool like Google Maps to get from your hotel to the museum more efficiently. I think when you live someplace for a while and make a trip regularly, you get used to certain ideas about your journey, which is why “everything is fine” is usually what people pick. Trying to compensate for every eventuality is mind-numbing. Still, making proper use of tools to be as efficient as possible is also a good idea.
That said, I am very much in favor of this line of thinking.
Making sure I understood you: you are saying that people sometimes pick “everything is fine” because:
1) they are confident that if anything goes wrong, they would be able to fix it, so everything is fine once again
2) they are so confident in it that they aren’t making specific plans, believing that they would be able to fix everything on the spur of the moment
aren’t you?
Looks plausible, but something must be wrong there, because the planning fallacy:
a) exists (so people aren’t evaluating their abilities well)
b) exists even when people aren’t familiar with the situation they are predicting (in which case they have no ground for the “ah, I’m able to fix anything anyway” effect)
c) exists even in people with low confidence (though maybe the effect is weaker there; that would be an interesting hypothesis to test)
I blame overconfidence and similar self-serving biases.