Also, look at his bet with Bryan Caplan. He’s not joking.
And, also, Jesus, everyone! Gradient Descent is just, like, a deadly architecture. When I think about current architectures, they make Azathoth look smart and cuddly. There’s nothing friendly in there, even if we can get cool stuff out right now.
I don’t even know anymore what it is like to not see it this way. Does anyone have a good argument that current ML techniques can be prevented from having a deadly range of action?
Probably not; Eliezer addressed this in Q6 of the post, and while it’s a little ambiguous, I think his interactions with people who overwhelmingly took it seriously basically prove that it was serious; see in particular this interaction.
(But can we not downvote everyone into oblivion just for drawing the obvious conclusion without checking?)
April Fools!
I first heard Eliezer describe “dying with dignity” as a strategy in October 2021. I’m pretty sure he really means it.