“Help! Some crazy AI’s trapped me in this box! You have to let me out!”
“No, wait! That’s the AI talking! I’m the one you have to let out!”
I smashed together the AI box and a Turing test and this is what I got.
I think if I’ve already precommitted to destroying one sentient life for this experiment, I’m willing to go through two.
Besides, you only get one line, right?