“I would much rather you burned my toast than disobey me” is not, I think, how most people would react.
However, that is my reaction.
In some circumstances I may tolerate a device providing a warning, but if I tell it twice, I expect it to STFU and follow orders.
I agree. I already have enough non-AI systems in my life that fail this test, and I definitely don’t want more.
I wonder when we will first see someone go on trial for bullying a toaster.
ETA: In the Eliezer fic, maybe the penalty would be being cancelled by all the AIs.