The word “right” (without modifiers such as “exactly”) might sound too weak and easily satisfiable, but I think the idea is the following: theories that seem complete and robust today might be found incomplete or wrong in the future. You cannot claim certainty in them, although you can probably claim high confidence under certain conditions.
You can’t ever claim absolute certainty in anything. There’s no 1.0 probability in predictions about the universe. But science can create claims of being “right” as strong and justified as any other known process. Saying “science doesn’t claim to get things right” is false, unless you go on to say “nothing can (correctly) claim to get things right, it’s epistemologically impossible”.
But are we 100% CERTAIN there’s no 1.0 probability in predictions about the universe?
We are certain because we treat it as an axiom (loosely speaking) of our epistemology. We’re as certain of it as we would be of a logical truth. In practice we are fallible and can make mistakes about logical truths. But in theory, they are absolutely certain; they have a higher status than mere beliefs about the universe.
Assigning 1.0 probability to any proposition means you can never update away from that probability, no matter what the evidence. That means there’s a part of your map that isn’t causally entangled with the territory. What we’re certain of is that there shouldn’t be such a part.
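The claim that a 1.0 prior can never move follows directly from Bayes’ rule: the posterior is the prior-weighted likelihood of the hypothesis divided by the total probability of the evidence, and when the prior is exactly 1, the alternative hypothesis contributes nothing to that denominator. A minimal sketch in Python (the `bayes_update` helper and the specific numbers are illustrative, not from the original discussion):

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) via Bayes' rule.

    prior            -- P(H), prior probability of the hypothesis
    likelihood_h     -- P(E|H), probability of the evidence if H is true
    likelihood_not_h -- P(E|not H), probability of the evidence if H is false
    """
    numerator = likelihood_h * prior
    evidence = numerator + likelihood_not_h * (1 - prior)
    return numerator / evidence

# A confident but non-dogmatic prior still moves on strong counter-evidence:
# evidence 99x more likely under not-H drags 0.99 down to 0.5.
print(bayes_update(0.99, 0.01, 0.99))  # → 0.5

# A prior of exactly 1.0 cannot move: the (1 - prior) term zeroes out
# the alternative, so the posterior is 1.0 no matter the likelihoods.
print(bayes_update(1.0, 0.01, 0.99))   # → 1.0
```

The second call is the “part of your map that isn’t causally entangled with the territory”: whatever the evidence, the output is fixed at 1.0.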