If it is not a false memory, I've seen this on Twitter from either EY or Rob Bensinger, but it's unlikely I'll find the source now; it was in the middle of a discussion.
Fair enough, thank you! Regardless, it does seem like a good reason to be concerned about alignment. If you have no idea how intelligence works, how in the world would you know what goals your created intelligence is going to have? At that point, it is like alchemy—performing an incantation and hoping not just that you got it right, but that it does the thing you want.