Gödel's incompleteness theorem: there are true propositions which can't be proved by AI (and an explanation could be counted as a type of proof).
That’s what I’m fearing, so I’m trying to see if the concept makes sense.