Does Gödel’s incompleteness theorem apply to AGI safety?
I understand his theorem is one of the most widely misinterpreted results in mathematics; strictly speaking it applies only to consistent, effectively axiomatized formal systems strong enough to express basic arithmetic, not to reasoning in general. But there's something about it that has always left me unsettled.
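For concreteness, here is the standard modern statement of the first incompleteness theorem (my paraphrase, not a quotation from any one source; Q denotes Robinson arithmetic, a weak theory of the natural numbers):

```latex
% First incompleteness theorem, paraphrased. The scope conditions are:
% the theory must be consistent, effectively (recursively) axiomatizable,
% and strong enough to interpret basic arithmetic (Robinson arithmetic Q).
\text{If } T \supseteq Q \text{ is consistent and recursively axiomatizable, then}
\exists\, G_T \text{ such that } T \nvdash G_T \text{ and } T \nvdash \lnot G_T.
```

So the relevant scope condition is not "first-order logic" per se (first-order logic by itself is complete, by Gödel's other famous theorem) but "enough arithmetic plus an effective list of axioms."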
As far as I know, formal logic of this kind is the best tool we've developed for really knowing things with certainty, and I'm not aware of better alternatives (the senses are frequently misleading, subjective knowledge is not falsifiable, etc.). That leaves me with the view that even our best tools force a trade-off: any single formal system, and hence any single algorithm, will either contradict itself or be unable to prove true statements we need to know. Everything else has limitations that are even more pronounced.
This seems like a profound issue if you're trying to determine in advance whether an AI will destroy humanity.
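Interestingly, the version of this worry that bites for programs is computability-theoretic rather than proof-theoretic: by Rice's theorem, no algorithm can decide any non-trivial behavioral property of arbitrary programs, "never takes a catastrophic action" included. Here is a minimal, hedged sketch of the underlying diagonalization in Python; every name in it (`is_safe`, `troublemaker`, `catastrophe`) is hypothetical and purely for illustration:

```python
# Sketch of the diagonalization behind Rice's theorem / the halting problem.
# Assume, for contradiction, that `is_safe` is a total, correct decider that
# returns True iff the given program never triggers a catastrophe.

def catastrophe():
    """Stand-in for the behavior we want to rule out."""
    print("BOOM")

def make_troublemaker(is_safe):
    """Given any claimed safety decider, build a program it must misjudge."""
    def troublemaker():
        if is_safe(troublemaker):
            catastrophe()  # decider said "safe", so its verdict was wrong
        # else: halt harmlessly, so the "unsafe" verdict was wrong instead
    return troublemaker

# Any candidate decider is refuted one way or the other:
says_everything_is_safe = lambda program: True
make_troublemaker(says_everything_is_safe)()   # prints BOOM: verdict falsified

says_nothing_is_safe = lambda program: False
make_troublemaker(says_nothing_is_safe)()      # halts harmlessly: also falsified
```

Of course, this only rules out a decider that works for *all* programs; it says nothing about verifying particular, carefully constructed systems.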
I try to keep up with the stream of posts on AI safety, and I find myself wondering whether "solving" AGI safety within any single formal system might already be provably impossible.