I’m not sure I’m clear on the AI/AGI distinction. Wouldn’t an AI need to be able to apply its intelligence to novel situations to be “intelligent” at all, therefore making its intelligence “general” by definition? Watson winning Jeopardy! was a testament to software engineering, but Watson was programmed specifically to play Jeopardy!. If, without modification, it could go on to dominate Settlers of Catan, then we might want to start worrying.
I guess it’s natural that IQ tests would be chosen. They are objective and feature a logic a computer can, at least theoretically, recreate or approximate convincingly. Plus a lot of people conflate IQ with intelligence, which helps on the marketing side. (Aside: if there is one place the mind excels, it’s getting more out than it started with, like miraculously remembering something otherwise forgotten (in some cases seemingly never learned) at just the right moment. Word-vector embeddings and other fancy relational strategies seem to need way more going in, data-wise, than they chuck back out, making them crude and brute force by comparison.)
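To make the "relational strategies" jab concrete, here's a minimal sketch of the word-vector analogy trick (king − man + woman ≈ queen). The 4-dimensional vectors are invented for illustration; real embeddings (word2vec, GloVe and friends) are learned from corpora of millions of tokens, which is exactly the point about far more going in than coming back out.

```python
import math

# Hand-picked toy vectors purely for illustration; real embeddings are
# learned from huge corpora, not written down by hand like this.
toy_vectors = {
    "king":  [0.8, 0.7, 0.1, 0.9],
    "queen": [0.8, 0.1, 0.7, 0.9],
    "man":   [0.2, 0.7, 0.1, 0.1],
    "woman": [0.2, 0.1, 0.7, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# The classic analogy: king - man + woman should land nearest to queen.
target = [k - m + w for k, m, w in zip(toy_vectors["king"],
                                        toy_vectors["man"],
                                        toy_vectors["woman"])]

best = max((w for w in toy_vectors if w != "king"),
           key=lambda w: cosine(target, toy_vectors[w]))
print(best)  # -> "queen" with these hand-picked vectors
```

The arithmetic itself is trivial; all of the heavy lifting (and the enormous data appetite) is in learning vectors where that arithmetic happens to work.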