> There are mathematical statements that are provable that some lawyer has not been exposed to, and so doesn’t think are true (or false).
Doesn’t strongly believe are true, or false, or other. But the mind is not a void before the training, and after getting a degree in math it still won’t be a void; neither will it be a computer immune to gamma rays and quantum effects, working in PA with a proof that PA is consistent that uses only PA. It will be a fallible thing with good reason to believe it has correctly read and parsed definitions, etc.
> If you do correct, then the fact of the error doesn’t tell you about the statement under investigation.
We’re talking about errors I am committing without having detected them. Are you discussing the case where I attempt to believe falsely and accidentally believe something true, or something similar?
> And if you’d like to estimate the proportion of the time you make errors, that’s likely to be helpful in your decision-making, but it doesn’t convert non-empirical statements into empirical statements.
Unfortunately, I have good reason to believe I imperfectly sort statements along the empirical/non-empirical divide.
> Unfortunately, I have good reason to believe I imperfectly sort statements along the empirical/non-empirical divide.
“1 + 2 = 3” is a statement that lacks empirical content. “F = ma” is a statement that has empirical content and is falsifiable. “The way to maximize human flourishing is to build a friendly AI that implements CEV(everyone)” is a statement with empirical content that is not falsifiable.
Folk philosophers do a terrible job distinguishing between the categories “lacks empirical content” and “is not falsifiable.” Does that prove the categories are identical?
> We’re talking about errors I am committing without having detected them. Are you discussing the case where I attempt to believe falsely and accidentally believe something true, or something similar?
I’m sorry, I don’t understand the question.
> neither will it be a computer immune to gamma rays and quantum effects
Yes, there are ways to become delusion [delusional—oops]. It is worthwhile to estimate the likelihood of this possibility, but that isn’t what I’m trying to do here.
> “1 + 2 = 3” is a statement that lacks empirical content. “F = ma” is a statement that has empirical content and is falsifiable. “The way to maximize human flourishing is to build a friendly AI that implements CEV(everyone)” is a statement with empirical content that is not falsifiable.
If you’re trying to demonstrate perfect ability to sort all statements into three bins, you have a lot more typing to do. If not, I don’t understand your point. Either you’re perfect at sorting such statements, or not. If not, there is a limit to how sure you should be that you correctly sorted each.
> If you do correct [errors that you made but have not identified—Ed.], then the fact of the error doesn’t tell you about the statement under investigation.
I don’t know what this means.
> there are ways to become delusion
?
> It is worthwhile to estimate the likelihood of this possibility, but that isn’t what I’m trying to do here.
For each statement I believe true, I should estimate the chances of it being true < 1.
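A minimal sketch of the arithmetic behind that estimate, with ε and q as purely illustrative placeholders (nothing above fixes their values):

\[
P(\text{true} \mid \text{I believe it}) \;=\; (1-\varepsilon)\cdot 1 + \varepsilon\, q \;=\; 1 - \varepsilon\,(1-q) \;<\; 1 \quad \text{for any } \varepsilon > 0,\ q < 1,
\]

where \(\varepsilon\) is the chance that the belief rests on an undetected error and \(q\) is the chance that the statement happens to be true anyway given such an error. For instance, \(\varepsilon = 10^{-3}\) and \(q = 1/2\) give a credence of \(0.9995\) rather than \(1\).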
> If you’re trying to demonstrate perfect ability to sort all statements into three bins, you have a lot more typing to do. If not, I don’t understand your point. Either you’re perfect at sorting such statements, or not. If not, there is a limit to how sure you should be that you correctly sorted each.
It is interesting that all the statements that we would like to be able to assign a truth value to can be sorted into one of these three bins. Additional bins are not necessary, and fewer bins would be insufficient.
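A toy sketch of the three bins, laid out side by side using only the example statements quoted earlier in this exchange; the placement shown is simply the one asserted above, not a claim that the sorting is infallible.

```python
# Toy illustration: the three bins discussed above, populated with the
# example statements from this exchange. The code adds no new claims;
# it only lays the asserted sorting out explicitly.
bins = {
    "lacks empirical content": [
        "1 + 2 = 3",
    ],
    "has empirical content and is falsifiable": [
        "F = ma",
    ],
    "has empirical content but is not falsifiable": [
        "The way to maximize human flourishing is to build a friendly AI "
        "that implements CEV(everyone)",
    ],
}

for label, statements in bins.items():
    for statement in statements:
        print(f"{statement!r} -> {label}")
```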