Every statement an AI tells us will be a lie to some extent, simply because it has to be a simplification for us to understand it.
Not so. You can definitely ask questions about complicated things that have simple answers.
Yes, that was an exaggeration—I was thinking of most real-world questions.
I was thinking of most real-world questions that aren’t of the form ‘Why X?’ or ‘How do I X?’.
“How much/many X?” → number
“When will X?” → number
“Is X?” → boolean
“What are the chances of X if I Y?” → number
Also, any answer that simplifies isn’t a lie if its simplified status is made clear.
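The question-form → answer-type mapping above can be sketched as a tiny type check; a minimal illustration, with all names (`ANSWER_TYPE`, `is_simple_answer`, the question-kind keys) being hypothetical labels, not anything from the discussion itself:

```python
from typing import Union

# Each question form from the list above maps to a simple answer type.
ANSWER_TYPE = {
    "how_much": float,   # "How much/many X?"            -> number
    "when": float,       # "When will X?"                -> number (e.g. a timestamp)
    "is": bool,          # "Is X?"                       -> boolean
    "chance_if": float,  # "What are the chances of X if I Y?" -> probability
}

def is_simple_answer(kind: str, answer: Union[float, bool]) -> bool:
    """True if the answer has the simple type the question form calls for."""
    expected = ANSWER_TYPE[kind]
    # bool is a subclass of int in Python, so handle it explicitly first.
    if expected is bool:
        return isinstance(answer, bool)
    return isinstance(answer, (int, float)) and not isinstance(answer, bool)
```

The point of the sketch: for these question forms the entire answer fits in one scalar, so no simplifying narrative (and hence no "lie") is needed.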