My understanding is that waitingforgodel doesn’t particularly want to discuss that topic, but thinks that it’s important that LW’s moderation policy be changed in the future for other reasons. In that case it appears to me the best way to go about it is to try to convince Eliezer using rational arguments.
A public commitment has been made.
Commitment to a particular moderation policy?
Eliezer has a bias toward secrecy.
I’m inclined to agree, but do you have an argument that he is biased (instead of us)?
In my observation Eliezer becomes irrational when it comes to dealing with risk.
I’d be interested to know what observation you’re referring to.
Eliezer has (much) higher status. Status drastically hinders the ability to take on board other people’s ideas when they contradict your own.
True, but I’ve been able to change his mind on occasion (whereas I don’t think I’ve ever succeeded in changing Robin Hanson’s for example).
The most important rational arguments that weigh into the decision involve discussing the subject matter itself. This is forbidden.
I doubt that waitingforgodel has any arguments involving the forbidden topic itself. Again, he hasn’t shown any interest in that topic, but just in the general moderation policy.
Overall, I agree that Eliezer is unlikely to be persuaded, but it still seems to be a better chance than anything else waitingforgodel can do.
My understanding is that waitingforgodel doesn’t particularly want to discuss that topic, but thinks that it’s important that LW’s moderation policy be changed in the future for other reasons.
See waitingforgodel’s actual words on the subject. We could speculate that these “aren’t his real reasons”, but they are certainly sane reasons, and it isn’t usually useful to presume we know what people want despite what they say, at least for the purpose of good-faith discussion if not for our personal judgement.
In that case it appears to me the best way to go about it is to try to convince Eliezer using rational arguments.
Waitingforgodel’s general goals could be achieved without relying on LW itself, in a way that essentially nullifies the influence of the censorship (at least on an ‘opt in’ basis), reducing it to no more than a trivial ongoing inconvenience. This wouldn’t be easy or likely for him to achieve, but see below for a possible option. If he achieved an outcome in which overt censorship created more discussion rather than less (the Streisand Effect), it might actually become in Eliezer’s interest to allow such discussions on LW: that would pull attention away from the other location and back to a place where he can exert a greater, though still short-of-censorship, form of influence.
Commitment to a particular moderation policy?
More so on this specific topic than in the general case. You are right that declining to censor something unrelated wouldn’t violate any public commitment.
Now, there is a distinction to be made that I consider important. Let’s not pretend this is about moderation. Moderation in any remotely conventional sense would be something applied to Eliezer’s reply, not to Roko’s post; there hasn’t been a more dramatic instance of personal abuse, and the response was anything but ‘moderate’. Without, for the purpose of this point, labelling it good or bad: this is about censoring an idea. I don’t think even those most in support of the banning would put it in the same category as moderation.
I’m inclined to agree, but do you have an argument that he is biased (instead of us)?
Not right now, but it is true that ‘him or us’ is something I would need to consider if I were focusing on this issue. I actually typed some examples in the grandparent but removed them: I present the ‘secrecy bias’ as a premise the reader will either share or not, without getting distracted by disagreement over the underlying reasoning.
True, but I’ve been able to change his mind on occasion (whereas I don’t think I’ve ever succeeded in changing Robin Hanson’s for example).
This is true and I should take the time to say I’m impressed with how well Eliezer works to counter the status effect. This is something that is important to him and he handles it better than most.
I model Robin as an academic, actually using Robin’s theories on how academics can be expected to behave to make reasonably good predictions about his behavior. It isn’t often that saying someone has irrational biases actually constitutes a compliment. :)
Overall, I agree that Eliezer is unlikely to be persuaded, but it still seems to be a better chance than anything else waitingforgodel can do.
Absolutely. But then, who really thought he could significantly increase existential risk anyway?
It would probably at least be possible for him to develop the skills and connections to create a public place for discussing anything and everything forbidden here, and to make it sufficiently visible that censorship actually increases visibility. He could even modify your new-comments viewer so that it also displays discussion of the censored topics, probably highlighted for visibility.
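(A rough sketch only, to make the idea concrete: something like the userscript below could merge an external mirror’s comments into a new-comments viewer and highlight the censored-topic threads. The feed URL, the JSON shape, and the element id are all invented for the example; nothing here describes an actual tool.)

```typescript
// Hypothetical userscript-style sketch: pull comments from an external mirror
// and display them alongside the existing "new comments" view, highlighting
// anything flagged as discussion of a censored topic.
// The feed URL, JSON shape, and element id are assumptions for illustration.

interface MirrorComment {
  id: string;
  author: string;
  html: string;
  censoredTopic: boolean; // flagged by the mirror, not by LW
}

const MIRROR_FEED = "https://example-mirror.invalid/comments.json"; // placeholder

async function mergeMirrorComments(): Promise<void> {
  const resp = await fetch(MIRROR_FEED);
  if (!resp.ok) return; // fail quietly if the mirror is unreachable
  const comments: MirrorComment[] = await resp.json();

  // Assumed id of the container the existing viewer renders into.
  const container = document.getElementById("new-comments");
  if (!container) return;

  for (const c of comments) {
    const div = document.createElement("div");
    div.className = "comment mirror-comment";
    div.innerHTML = `<strong>${c.author}</strong>: ${c.html}`;
    if (c.censoredTopic) {
      div.style.border = "2px solid orange"; // highlighted for visibility
    }
    container.appendChild(div);
  }
}

void mergeMirrorComments();
```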
Alternatively, he could arrange to cooperate with other interested parties to collectively condition all their existential risk donations on certain minimum standards of behavior. This is, again, not coercive, just practical action. It makes a lot of sense to refrain from providing assistance to an organisation that censors all discussion of whether the AI it wishes to create would torture people who didn’t give it all their money. It also doesn’t mean that the charitable contributors are committing to wasting the money on crack and hookers if they do not get their way: contributions could be invested wisely and remain available to the first existential risk and opportunity organisation that meets a standard of predicted effectiveness and acceptable ethical behavior.
I doubt that waitingforgodel has any arguments involving the forbidden topic itself.
He probably doesn’t. Others, including myself, do, and analyzing the subject would produce more (both for and against). This suggests that it would be a bad idea for wfg to try to persuade Eliezer with rational argument himself. If his belief is that other people have arguments worth hearing, then he is best off finding a way to make them heard.
People should never limit themselves to actions that are ineffective, except when the ‘choke’ effect is in play and you need to signal low status. That actually seems to be the main reason we would try to make him work within the LW power structure. We wouldn’t, for example, tell Robin Hanson, or even you, to limit yourself to persuading the authority figures; we’d expect you to do what works.
I don’t think waitingforgodel will do any of these things. We could even say that he could not do them: the motivation to gain the skills required for that sort of social influence is a trait few people have, and wfg (like most people) is unlikely to be willing to engage in the personal development that would lead him in that direction.
(Thank you for the well-reasoned and non-aggressive responses. I value being able to explore the practical implications of the issue; it strikes at the core of important elements of instrumental rationality.)
(A random note: I had to look up the name for the Streisand Effect from a comment made here yesterday by Kodos. I was surprised to discover just how many comments have been made since then. I didn’t keep a count but it was a lot.)
That actually seems to be the main reason we would try to make him work within the LW power structure.
I don’t think it was the main reason for my suggestion. I thought that threatening Eliezer with existential risk was obviously a suboptimal strategy for wfg, and looked for a better alternative to suggest to him. Rational argument was the first thing that came to mind since that’s always been how I got what I wanted from Eliezer in the past.
You might be right that there are other even more effective approaches wfg could take to get what he wants, but to be honest I’m more interested in talking about Eliezer’s possible biases than the details of those approaches. :)
Your larger point about not limiting ourselves to actions that are ineffective does seem like a good one. I’ll have to think a bit about whether I’m personally biased in that regard.
I’m more interested in talking about Eliezer’s possible biases than the details of those approaches. :)
I am trying to remember the reference to Eliezer’s discussion of keeping science safe by limiting it to people who are able to discover it for themselves, i.e. security by FOFY. I know he has written a post on it somewhere but don’t have the link (or a keyword to search for). If I recall correctly, he also had Harry preach on the subject and referred to it by an explicit name.
I wouldn’t go so far as to say the idea is useless, but I don’t quite share Eliezer’s faith in it. Nor would I want to reply to a straw man built from my hazy recollections.