Ah, I see. I misunderstood your point. You don’t worry about your current beliefs, you worry about the extrapolated ones. How do you see this represented? An algorithm that, when fed a statement, outputs a proof, a disproof, or a proof of undecidability? Or something that spits out all possible true statements, unprompted, never shutting up?
Yeah. Judging from the comments, it looks like most people misunderstood the point, though the point is right there in the title :-( I think the result of extrapolation should be an algorithm that spits out all true statements, or an algorithm that checks proofs. These are equivalent, and known as “effectively generated” or “recursively axiomatizable” theories. It can’t be an algorithm that answers provable/disprovable/independent, because that’s impossible even for PA, and I’d like to think that I believe in more than just PA.
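The equivalence is the usual dovetailing argument: given a proof checker, you can enumerate every provable statement just by running the checker over all finite strings in order of length. Here’s a minimal sketch of that direction, with a toy checker I made up (single-digit addition facts) standing in for a real theory’s proof checker:

```python
from itertools import count, product

# Toy stand-in for a proof checker (my own assumption, not the theory
# discussed above): a "proof" is a string "a+b=c", and it is valid iff
# the arithmetic is correct. In this toy theory a proof is its own conclusion.
def check_proof(candidate: str) -> bool:
    parts = candidate.split("=")
    if len(parts) != 2:
        return False
    left, right = parts
    summands = left.split("+")
    if len(summands) != 2 or not all(s.isdigit() for s in summands + [right]):
        return False
    return int(summands[0]) + int(summands[1]) == int(right)

ALPHABET = "0123456789+="

def enumerate_theorems():
    """Turn a proof checker into a theorem enumerator: generate every
    finite string over the proof alphabet in length order, keep the ones
    the checker accepts, and emit what they prove."""
    for length in count(1):
        for chars in product(ALPHABET, repeat=length):
            candidate = "".join(chars)
            if check_proof(candidate):
                yield candidate

if __name__ == "__main__":
    gen = enumerate_theorems()
    print([next(gen) for _ in range(5)])  # e.g. '0+0=0', '0+1=1', ...
```

The converse direction is just as mechanical: if you have an enumerator, a string is a “proof” of a statement when the statement shows up within the first n outputs, for some n encoded in the proof. That’s why the two presentations pick out the same class of theories.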