It’s hard to see how an AGI can get very far without being able to deal with logical/mathematical uncertainty.
Do you have an intuition for how it would do this without contradicting itself? I tried to ask a similar question earlier, but I got it wrong in the first draft and, as far as I can tell, never received an answer to the relevant part.
I just want to know whether my own intuition fails in the obvious way.