He’s talking about an AI box. Eliezer has convinced people to let out a potentially unfriendly [1] and dangerously intelligent [2] entity before, although he has never revealed how he did it.
[1] Think “paperclip maximizer”.
[2] Think “near-omnipotent”.
Thank you. I knew that, but didn’t make the association.