If you were in a group and you were shown a box with 5 dice in it for a brief moment, but later everyone agreed that there were only 4 dice...
This is a pretty standard example of reasoning under uncertainty. You have two possible events, “there were 5 dice” vs. “there were 4 dice”. You want to assign a probability to each event, because, not being omniscient, you don’t know how many dice there actually were. You have several percepts, meaning pieces of evidence: your memories and the claims of the other people. Each of these percepts has some probability of being true: your memories are not infallible, the other people could be wrong or lying, etc. You could run all these numbers through Bayes’ Rule, and determine which of the events (“5 dice” vs. “4 dice”) is more likely to be true.
It also helps to know that all humans have a bias when it comes to peer pressure; our memories become especially faulty when we perceive a strong group consensus that contradicts them. Knowing this can help you calibrate your probabilities.
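To make that concrete, here's a minimal sketch of the calculation in Python. Every number in it (the 50/50 prior, how often memories and peers are right, the size of the group) is something I made up for illustration, not a measurement:

```python
# Minimal Bayes' Rule sketch for the dice scenario.
# Every probability below is an invented illustration, not real data.

def posterior_five_dice(p_memory_right, p_peer_right, n_peers):
    """P(5 dice | my memory says 5, n_peers peers say 4),
    assuming a 50/50 prior and independent percepts."""
    prior_five = prior_four = 0.5
    # Likelihood of the evidence under each hypothesis.
    like_five = p_memory_right * (1 - p_peer_right) ** n_peers
    like_four = (1 - p_memory_right) * p_peer_right ** n_peers
    joint_five = prior_five * like_five
    joint_four = prior_four * like_four
    return joint_five / (joint_five + joint_four)

# Naive calibration: my memory is right 80% of the time, each peer 70%.
print(posterior_five_dice(0.80, 0.70, n_peers=3))  # ~0.24

# Knowing that memory gets less reliable against a unanimous group,
# I might discount my own recollection to 65%:
print(posterior_five_dice(0.65, 0.70, n_peers=3))  # ~0.13
```

Note how the bias correction enters as a recalibration of one input (how much to trust the memory percept), not as a change to the rule itself.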
Anyways, you say that “belief is a hard thing to cultivate”, but in your dice scenario, there’s no need to cultivate anything, because you don’t care about beliefs, you care about how many dice there were; i.e., you care specifically about the truth.
Even so, though, I believe that it is my biological duty to do everything possible to survive, no matter how hopeless the situation.
I am not sure what “biological duty” means, but still, it sounds like you do care whether you live or die; i.e., you want to live. This is a goal, and you can take actions in order to further this goal, and you want to make sure your actions are as optimal as possible, right?
However, I am aware that simple explanations are not always the right ones.
It depends on what you mean by “simple”; according to Ockham’s Razor, “God did it” is a vastly less simple explanation than most others, due to the number of implicit assumptions you will end up making. That said, it sounds like you have several possible events (“God did it”, “aliens did it”, “I got lucky”, etc.), and several pieces of evidence; so this scenario is similar to your example with the dice. If you cared about the truth, you could assign a probability to each event based on the evidence. Of course, you don’t have to care about the truth in this case; but if you did, there are ways you could approach it.
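To sketch how that could look (again, every number here is invented for illustration, not a claim about the actual odds), you can express Ockham’s Razor as a prior that shrinks with each implicit assumption an explanation requires, then update on the evidence as before:

```python
# Same Bayesian machinery, now with several hypotheses and an Ockham prior.
# All counts, priors, and likelihoods are invented illustrations.

# Rough count of the extra assumptions each explanation smuggles in.
complexity = {"God did it": 10, "aliens did it": 8, "I got lucky": 1}

# Ockham prior: each extra assumption halves the prior probability.
unnorm = {h: 0.5 ** c for h, c in complexity.items()}
z = sum(unnorm.values())
prior = {h: p / z for h, p in unnorm.items()}

# Likelihood of the observed evidence ("I survived") under each hypothesis.
likelihood = {"God did it": 0.9, "aliens did it": 0.9, "I got lucky": 0.3}

joint = {h: prior[h] * likelihood[h] for h in prior}
z = sum(joint.values())
posterior = {h: p / z for h, p in joint.items()}

for h, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{h}: {p:.3f}")
# "I got lucky" dominates: even with a lower likelihood, its prior is
# hundreds of times larger because it assumes so much less.
```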
Other than that, I could have believed or done anything I wanted and not have really affected the outcome.
It’s possible that tying that tourniquet did save your life, so there’s at least one thing you did do which likely affected the outcome.
However, if there was an asteroid going to hit tomorrow, I am not sure what help I could offer humanity even if I did know.
I think I see where you’re coming from: there’s no point in spending a lot of effort on worrying about low-probability events which you’d be powerless to affect even if they did happen. As you said, the Sun could die tomorrow, but I can safely ignore this fact. However, I think you’re making an unwarranted leap of faith when you precommit to never worrying about such events, regardless of circumstances.
For example, there’s nothing you could do about that asteroid today, and in fact it’s very likely not even coming. But if we knew that the asteroid was, indeed, heading for Earth, there could be lots of things you could do—you could donate money to the anti-asteroid fund, volunteer at the anti-asteroid missile factory, etc. If you had more information about the asteroid, you could re-evaluate your decisions, and determine the best course of action; but you can’t do that if you have committed yourself to not doing anything regardless of circumstances. You also can’t do that if you have no access to that information in the first place, which is why caring about the truth is sometimes important.
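As a toy illustration of that re-evaluation (with made-up costs, stakes, and probabilities throughout), the very same action can flip from pointless to clearly worthwhile purely because new information changed your probability estimate:

```python
# Toy expected-value sketch: is some anti-asteroid action worth taking?
# The probabilities, stakes, and costs are invented for illustration.

def action_worth_taking(p_impact, p_action_helps=0.01,
                        value_of_survival=1_000_000, cost_of_action=100):
    """True if the expected benefit of acting exceeds its cost
    (all quantities in arbitrary units)."""
    expected_benefit = p_impact * p_action_helps * value_of_survival
    return expected_benefit > cost_of_action

print(action_worth_taking(p_impact=1e-9))  # False: safe to ignore it today
print(action_worth_taking(p_impact=0.9))   # True: new info flips the decision
```

Nothing about the action or your values changed between the two calls; only the evidence did, which is the whole point of keeping your options open.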