This sounds like it would be “natural” to use, but it would not be, because translating intention into language is cognitively effortful, and very unnatural for a very wide array of action types.
I often do not think in words about what I want to do, or want done. Indeed, I often don’t think about doing the thing at all; I just do it, and insofar as there’s cognition to be done, it’s done as part of the action, transparently.
Having to translate everything into words would dramatically narrow the cognitive bandwidth between me and the effects I can accomplish with my various technological tools.
A lot of people, including me, sometimes think in words, and can otherwise translate effortlessly, so I don’t think it’s a general rule that people would have to think about it too much.
Eventually, I think, everyone would acclimatize, and instead of effortlessly doing the thing, they would learn to effortlessly command the AI.
It’s an interesting point I hadn’t considered before.
Edit: I also like how both our comments are correctness-strong-downvoted by a single person, yet we more or less contradict each other. Oh, well.
Well, while it’s unlikely that we’re both right, so long as our views are not literally logical negations of each other, it is surely possible for us to both be wrong…