When I tried to answer for myself why we don't trade with ants, communication was one of the first things I considered (I can't remember what was actually first). But I worry it may be more analogous to AI than argued here.
We can sort of communicate with ants. We know to some degree what makes them tick; it's just that we mostly use that communication to lie to them and tell them this poison is actually really tasty. The issue may be less that communication is impossible, and more that it's too costly to figure out, so no one tries to become Antman even if they could cut their janitorial costs by a factor of 7.
The next thought I had was that, if I were to try to get ants to clean my room, the easiest route is probably not figuring out how to communicate, but breeding ants with different behavior (e.g., searching for small bits of food instead of large ones; this seems harder than that sentence suggests, but still probably easier than learning to speak ant). I don't like what that would be analogous to in human-AI interactions.
I think it's possible that an AI could fruitfully trade with humans. Even without a body, posting an ad on Craigslist to get someone to move something heavy is probably easier than figuring out how to hijack a wifi-enabled crane or something.
But I don't know how quickly that changes. If the AI is trying to build a sci-fi gadget, the instructions for building it may be long or complicated enough that a human has trouble following them accurately. The costs of writing intuitive instructions, and of designing the gadget so that idiot-proof construction is even possible, could be high enough that the AI is better off doing it itself.