Ian C., that is half the philosophy of Epicurus in a nutshell: there are no gods, there is no afterlife, so the worst case scenario is not subject to the whims of petulant deities.
If you want a sufficient response to optimism, consider: is the probability that you will persist forever 1? If not, it is 0. If there is any probability of your annihilation per trial, no matter how small, you will not survive for an infinite amount of time; that is what happens given infinite time: everything possible. Even if the chance of all your backup plans failing at once is P=1/(3^^^3), that outcome will come up eventually given infinite trials.
Not necessarily. If the per-event risk shrinks fast enough that the risks sum to a finite total (i.e. faster than 1/n, such as 1/n² for the nth event), the probability of surviving every event can be strictly between 0 and 1.
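A quick numeric sketch of the disagreement above (assuming independent events, so the survival probability is just the product of the per-event survival chances): with a fixed per-event risk the product decays toward 0, but with risk shrinking like 1/n² it converges to a positive limit, about 1/2 when starting from n=2.

```python
def survival_probability(risks):
    """Probability of surviving every event, assuming the events are independent."""
    p = 1.0
    for r in risks:
        p *= (1.0 - r)
    return p

# Fixed risk per event: survival probability decays toward 0 as trials accumulate.
fixed = survival_probability(0.001 for _ in range(100_000))

# Risk shrinking like 1/n^2 (n = 2, 3, ...): the product telescopes toward 1/2.
shrinking = survival_probability(1 / n**2 for n in range(2, 100_000))

print(fixed)      # vanishingly small
print(shrinking)  # close to 0.5
```

The dividing line is whether the risks have a finite sum: a constant risk (or one shrinking only like 1/n) sums to infinity and drives survival to 0, while any faster decay leaves a nonzero chance of surviving forever.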
Unless you make one more Horcrux each day than you did the day before, that's never going to happen. And there's still the finite, fixed, non-zero chance of the magic widget being destroyed and all of your backups failing simultaneously, or of the false vacuum collapsing. Unless, that is, you seriously think you can invent completely novel, non-duplicate ways to prevent your death at a constantly accelerating rate, many of which survive hypothetical universe-ending apocalypses.
Unless we find a way to escape the known universe, or discover something similarly munchkinneritorial, we’re all going to die.
With all the sci fi brought up here, I think we are familiar with Hitler’s Time Travel Exemption Act.