Right, so it didn’t come completely out of nowhere, but it still seems uncharitable at best to go from ‘mostly fake or pointless or predictable’ where mostly is clearly modifying the collective OR statement, to ‘almost everyone else is faking it.’
EDIT: Looks like there’s now a comment apologizing for, among other things, exactly this change.
It also seems uncharitable to go from (A) “exaggerated one of the claims in the OP” to (B) “made up the term ‘fake’ as an incorrect approximation of the true claim, which was not about fakeness”.
You didn’t explicitly say (B), but when you write stuff like
The term ‘faking’ here is turning a claim of ‘approaches that are being taken mostly have epsilon probability of creating meaningful progress’ to a social claim about the good faith of those doing said research, and then interpreted as a social attack, and then therefore as an argument from authority and a status claim, as opposed to pointing out that such moves don’t win the game and we need to play to win the game.
I think most (> 80%) reasonable people would take (B) away from your description, rather than (A).
Just to be totally clear: I’m not denying that the original comment was uncharitable, I’m pushing back on your description of it.
It’s not like this is the first time Eliezer has said “fake”, either:
I validate this as a nonfake alignment research direction that seems important.
If he viewed almost all alignment work as nonfake, it wouldn’t be worth noting in his praise of RR. I bring this up because “EY thinks most alignment work is fake” seems to me to be a non-crazy takeaway from the post, even if it’s not true.
(I also think that “totally unpromising” is the normal way to express “approaches that are being taken mostly have epsilon probability of creating meaningful progress”, not “fake.”)