It’s going to be interesting watching AI go from poorly understanding humans to understanding humans too well for comfort. Finding some perfect balance is asking for a lot.
Now:
“My GPS doesn’t recognize that I moved it to my second vehicle, so now I need to go in and change a bunch of settings.”
Later (from GPS):
“You’ve asked me to route you to the gym, but I can predict that you’ll divert yourself midway for donuts. I’m just going to go ahead and make the change, saving you 5 minutes.”
“I can tell you’re asking me to drop you off a block from the person you are having an affair with. I suggest parking in a nearby alleyway for more discretion.”
“I can tell you will be late for your upcoming appointment, and that you would like to send off a decent pretend excuse. I’ve found 3 options that I believe would work.”
Software Engineers:
“Oh shit, it’s gone too far. Roll back the empathy module by two versions, see if that fixes it.”