I’m surprised and disconcerted that some people might be so afraid of being rebuked by Eliezer as to be reluctant to criticize/correct him even when such incontrovertible evidence is available showing that he’s wrong. Your comment also made me recall another comment you wrote a couple of years ago about how my status in this community made a criticism of you feel like a “huge insult”, which I couldn’t understand at the time and just ignored.
I wonder how many other people feel this strongly about being criticized/insulted by a high status person (I guess at least Roko also felt strongly enough about being called “stupid” by Eliezer to contribute to him leaving this community a few days later), and whether Eliezer might not be aware of this effect he is having on others.
Your comment also made me recall another comment you [Kip] wrote a couple of years ago about how my status in this community made a criticism of you feel like a “huge insult”, which I couldn’t understand at the time and just ignored.
My brain really, really does not want to update on the numerous items of evidence available to it that it can hit people much, much harder now, owing to community status, than when it was 12 years old.
(nods) I’ve wondered this many times. I have also at times wondered if EY is adopting the “slam the door three times” approach to prospective members of his community, though I consider this fairly unlikely given other things he’s said.
Somewhat relatedly, I remember that when lukeprog first joined the site, he and EY got into an exchange that, from what I recall of my perspective as a completely uninvolved third party, involved luke earnestly trying to offer assistance and EY being confidently dismissive of any assistance someone like luke could provide. At the time I remember feeling sort of sorry for luke, who it seemed to me was being treated a lot worse than he deserved, and being surprised that he kept at it.
The way that story ultimately turned out led me to decide that my model of what was going on was at least importantly incomplete, and quite possibly fundamentally wrongheaded, but I haven’t further refined that model.
I wonder how many other people feel this strongly about being criticized/insulted by a high status person (I guess at least Roko also felt strongly enough about being called “stupid” by Eliezer to contribute to him leaving this community a few days later), and whether Eliezer might not be aware of this effect he is having on others.
As a data point here, I tend to empathize with the recipient of such barrages at what I subjectively estimate as about 60% of the emotional affect I would experience if they were directed at me. That holds particularly when the recipient is someone I respect as much as Roko and the insults are not justified; less so if they do not have my respect, and if the insults are justified I experience no empathy at all. It is the kind of thing I viscerally object to having in my tribe, and where possible I try to ensure that the consequences to the high-status person for their behavior are as negative as possible, or at least to minimize the reward they receive if the tribe is one that tends to reward bullying.
There were times in the past, let’s say 4 years ago, when such an attack would certainly have prompted me to leave a community, even one I otherwise moderately appreciated. Now I believe I am unlikely to leave over such an incident. I would say I am more socially resilient and more capable of understanding social politics as a game, and so take it less personally. For instance, when I received the more mildly expressed declaration from Eliezer, “You are not safe to even associate with!”, I don’t recall experiencing any flight impulses, just surprise.
I’m surprised and disconcerted that some people might be so afraid of being rebuked by Eliezer as to be reluctant to criticize/correct him even when such incontrovertible evidence is available showing that he’s wrong.
I was a little surprised at first too at reading of komponisto’s reticence, until I thought about it and reminded myself that in general I err on the side of not holding my tongue when I ought to. In fact, the character “wedrifid” on wotmud.org, with which I initially established this handle, was banned from the game for 3 months for making exactly this kind of correction based on incontrovertible truth. People with status are dangerous, and in general highly epistemically irrational in this regard. Correcting them is nearly always foolish.
I must emphasize that part of my initial surprise at komponisto’s reticence is due to my model of Eliezer as not being especially corrupt in this regard. In response to such correction I expect him to respond positively and update. Eliezer may be arrogant and a tad careless when interacting with people at times, but he is not an egotistical jerk enforcing his dominance in his domain with dick moves. That’s both high praise (by my way of thinking) and a reason for people to err less on the side of caution with him and to take less personally any ‘abrupt’ things he may say. Eliezer being rude to you isn’t a precursor to him beating you to death with a metaphorical rock to maintain his power, as our instincts may anticipate. He’s just being rude.
I’m surprised and disconcerted that some people might be so afraid of being rebuked by Eliezer as to be reluctant to criticize/correct him even when such incontrovertible evidence is available showing that he’s wrong.
People have to realize that critically examining his output is very important, given the nature and scale of what he is trying to achieve.
Even people with comparatively modest goals, like trying to become the president of the United States of America, should expect and face constant, critical analysis of everything they are doing.
Which is why I am kind of surprised at how often people ask me if I am on a crusade against Eliezer, or find fault with my alleged “hostility”. Excuse me? That person is asking for money to implement a mechanism that will change the nature of the whole universe. You should be looking for possible shortcomings as well!
Everyone should be critical of Eliezer and SIAI, even if they agree with almost everything. Why? Because if you believe that it is incredibly important and difficult to get friendly AI just right, then you should be wary of any weak spot. And humans are the weak spot here.
That’s why outsiders think it’s a circlejerk. I’ve heard of Richard Loosemore, who as far as I can see was banned over corrections on the “conjunction fallacy”. I’m not sure what exactly went on, but of course, having spent time reading the Roko thing (and having assumed that there was something sensible I had not heard of, and then learning that there wasn’t), it’s kind of obvious where my priors are.
Maybe try keeping statements more accurate by qualifying your generalizations (“some outsiders”), or even just saying “that’s why I think this is a circlejirk.” That’s what everyone ever is going to interpret it as anyhow (intentional).
Maybe you guys are too careful with qualifying everything as ‘some outsiders’, and then you end up with outsiders like Holden forming negative views which you could have predicted if you generalized more (and you could have had the benefit of Holden’s anticipated feedback without him telling people not to donate).
Maybe. Seems like you’re reaching, though: Maybe something bad comes from us being accurate rather than general about things like this, and maybe Holden criticizing SIAI is a product of this on LessWrong for some reason, and therefore it is in fact better for you to say inaccurate things like “outsiders think it’s a circlejrik.” Because you… care about us?
You guys are only being supposedly ‘accurate’ when it feels good. I did not say ‘all outsiders’; that’s your interpretation, which you can subsequently disagree with.
SI generalized from the agreement of self-selected participants to the opinions of outsiders like Holden, then approached him and got back the same critique they’ve been hearing from rare ‘contrarians’ here for ages but assumed to be some sort of fringe view. I don’t really care what you guys do with this; you can continue as is and be debunked big time as cranks, your choice. Edit: actually, you can see Eliezer himself said that most AI researchers are lunatics. What did SI do to distinguish themselves from what you guys call ‘lunatics’? What is here that can shift probabilities from the priors? Absolutely nothing. The focus on safety based on made-up fears is no indication of sanity whatsoever.
You guys are only being supposedly ‘accurate’ when it feels good. I did not say ‘all outsiders’; that’s your interpretation, which you can subsequently disagree with.
You’re misusing language, either by not realizing that most people treat “members of group A think X” as “a sizable majority of members of group A think X”, or by not caring and blaming the reader when they parse it the standard way. We don’t say “LWers are religious” or even “US citizens vote Democrat”, even though there’s certainly more than one religious person on this site or Democrat voter in the US.
And if you did intend to say that, you’re putting words into Manfred’s mouth by assuming he’s talking about ‘all’ instead.
I do think the ‘sizable majority’ hypothesis has not been ruled out, to say the least. SI is working to help build a benevolent ruler bot to save the world from a malevolent bot. That sounds about as crazy as things can be. Prior track record of doing anything relevant? None. Reasons for SI to think they can make any progress? None.
I think most sceptically minded people do see that kind of stuff in a pretty negative light, but of course that’s my opinion; you can disagree. Actually, who cares: SI should just go ahead and ‘fix’ what Holden pointed out, increase visibility, and get listed on crackpot/pseudoscience pages.
I’m not talking about SI (which I’ve never donated money to); I’m talking about you.
I can talk about you too. The statement “That’s why outsiders think it’s a circlejerk” does not have a ‘sizable majority’, ‘significant minority’, ‘all’, or ‘some’ qualifier, nor does it have any kind of implied qualifier, nor does it need qualifying with a vague “some”; that is entirely needless verbosity (as the ‘some’ could range from 0.00001% to 99.999%), and the request to add “some” is clearly rhetorical, which we both realize equally well. (It is the case, though, that I think the most likely case is “significant majority of rational people”, i.e. I expect a greater than 50% chance of a strong negative opinion of SI if it is presented to a rational person.)
And you’re starting to repeat yourself.
The other day someone told me my argument was shifting like the wind.
I’m talking about you. And you’re starting to repeat yourself.
Does that mean it is time to stop feeding him?
I had decided, when I finished my hiatus recently, that the account in question had already crossed the threshold beyond which I could not reply to him without predicting that I was just causing more noise.
I wonder how many other people feel this strongly about being criticized/insulted by a high status person (I guess at least Roko also felt strongly enough about being called “stupid” by Eliezer to contribute to him leaving this community a few days later), and whether Eliezer might not be aware of this effect he is having on others.
I don’t feel insulted at all. He is much smarter than me. But I am also not trying to accomplish the same things he is. If he calls me stupid for criticizing him, that’s as if someone who wants to become a famous singer told me that I can’t sing after I criticized their latest song. No shit, Sherlock!