It’s probably not a good idea to feed an AI inconsistent data. For example, if the evidence shows that the Earth is round, but the AI is absolutely sure it isn’t, it will doubt any such evidence, which could lead to a very strange worldview.
But I think it’s possible to make an AI know about a fact while avoiding thinking about it.