I think he will have a strong feeling that pi is about 3.141…
That’s the key issue. Reality is doing something here. And you know in advance what his model will move to. You don’t think he will succeed in his attempt. At the end of the day, you are pretty sure that there’s something objective going on.
More starkly, I can give you mathematical examples where your intuition will be wildly at odds with the correct math. Some of those make fun games to play for money. I suspect that you won’t be willing to play them with me even if your intuition says that you should win and I shouldn’t.
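One classic game of this kind is played with nontransitive dice: each die in the cycle beats the next with probability greater than 1/2, so whoever picks second can always choose a favorable matchup. Here is a minimal sketch in Python, using one standard set of dice values (the dice and names are illustrative; nothing above names a specific game):

```python
import itertools

# Illustrative nontransitive ("rock-paper-scissors") dice: A beats B,
# B beats C, and C beats A, each with probability 20/36 > 1/2.
# Intuition says there should be a single "best" die; there isn't.
DICE = {
    "A": [2, 2, 4, 4, 9, 9],
    "B": [1, 1, 6, 6, 8, 8],
    "C": [3, 3, 5, 5, 7, 7],
}

def win_probability(x: str, y: str) -> float:
    """Exact probability that die x rolls strictly higher than die y."""
    wins = sum(a > b for a, b in itertools.product(DICE[x], DICE[y]))
    return wins / 36

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"P({x} beats {y}) = {win_probability(x, y):.3f}")  # 0.556 each
```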
That’s a bit different from what I’m trying to say. My choice of the word “intuition” was clearly bad; I should have talked about mental experiences. My point is that when I do mathematics, when I, for example, use the axioms and theorems of the natural numbers to prove that 1+1 is 2, I have to rely on my memories and feelings at some point. If I use a theorem proven before, I must rely on my memory that I have proven that theorem before, and proven it correctly, but remembering is just another kind of vague mental experience. I could also remember the axioms of the natural numbers wrong, even if it seemed clear to me that I remembered them correctly. I have to rely on the feeling of remembering correctly.
This is why I define truth as what you truly believe. Once you have carefully checked that you used all the axioms and theorems correctly, you will truly believe that you made no mistake. Then you can truly believe that 1 + 1 is 2, and it’s safe to say it’s the truth.
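As an aside, the derivation in question is short enough to check mechanically. A minimal sketch in Lean 4, assuming only the successor-style definition of addition (the spelled-out calc steps are illustrative, not a transcript of anyone’s hand proof):

```lean
-- 1 + 1 = 2 from the definition of addition on the naturals:
--   a + 0 = a   and   a + succ b = succ (a + b).
-- In Lean 4 the equation already holds by computation:
example : 1 + 1 = 2 := rfl

-- The same fact spelled out step by step, mirroring a hand derivation:
example : 1 + 1 = 2 :=
  calc 1 + 1 = Nat.succ (1 + 0) := rfl  -- a + succ b = succ (a + b)
    _ = Nat.succ 1 := rfl               -- a + 0 = a
    _ = 2 := rfl                        -- 2 is succ 1 by definition
```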
FWIW: I agree with you that:
my beliefs are always the outputs of real-world embodied algorithms (for example, those associated with remembering previously proven axioms) and therefore not completely reliable.
there exists a non-empty set S1 of assertions that merit a sufficiently high degree of confidence that it is safe to call them “true” (while keeping in mind when it’s relevant that we mean “with probability 1-epsilon” rather than “with probability 1”).
I would also say that:
there exists a non-empty set S2 of assertions that don’t merit a high degree of confidence, and that it is not safe to call true.
the embodied algorithms we use to determine our confidence in assertions are sufficiently unreliable that we sometimes possess a high degree of confidence in S2 assertions. This confidence is not merited, but we sometimes possess it nevertheless.
Would you agree with both of those statements?
Assuming you do, then it seems to follow that by “what I truly believe” you mean to exclude statements in S2. (Since otherwise, I could have a statement in S2 that I truly believe, and that is therefore definitionally true, yet at the same time not safe to call true, which seems paradoxical.)
Assuming you do, then sure: if I accept that “what I truly believe” refers to S1 and not S2, then I agree that truth is what I truly believe, although that doesn’t seem like a terribly useful thing to know.
Yes, I think you managed to put my thoughts into words very well here. Probably a lot more clearly than I.