That actually seems to be the main reason we would try to make him work within the LW power structure.
I don’t think it was the main reason for my suggestion. I thought that threatening Eliezer with increased existential risk was obviously a suboptimal strategy for wfg, so I looked for a better alternative to suggest to him. Rational argument was the first thing that came to mind, since that’s always been how I got what I wanted from Eliezer in the past.
You might be right that there are other even more effective approaches wfg could take to get what he wants, but to be honest I’m more interested in talking about Eliezer’s possible biases than the details of those approaches. :)
Your larger point about not limiting ourselves to actions that are ineffective does seem like a good one. I’ll have to think a bit about whether I’m personally biased in that regard.
I am trying to remember the reference to Eliezer’s discussion of keeping science safe by limiting it to people who are able to discover it for themselves, i.e., security by FOFY. I know he wrote a post on it somewhere but don’t have the link (or a keyword to search on). If I recall correctly, he also had Harry preach on the subject and referred to it by an explicit name.
I wouldn’t go so far as to say the idea is useless, but I don’t quite share Eliezer’s faith in it either. Nor would I want to reply to a straw man constructed from my hazy recollections.