Maybe try to keep your statements more accurate by qualifying your generalizations ("some outsiders"), or even just saying "that's why I think this is a circlejerk." That's what everyone ever is going to interpret it as anyhow (exaggeration intentional).
Maybe you guys are too careful, qualifying everything as 'some outsiders', and then you end up with outsiders like Holden forming negative views which you could have predicted if you generalized more (and you would have had the benefit of Holden's anticipated feedback without him telling people not to donate).
Maybe. Seems like you're reaching, though: maybe something bad comes from us being accurate rather than general about things like this, and maybe Holden criticizing SIAI is somehow a product of this on LessWrong, and therefore it is in fact better for you to say inaccurate things like "outsiders think it's a circlejerk." Because you… care about us?
You guys are only supposedly 'accurate' when it feels good. I have not said 'all outsiders'; that's your interpretation, which you can then disagree with.
SI generalized from the agreement of self-selected participants to the opinions of outsiders like Holden, then approached him and got back the same critique they've been hearing from rare 'contrarians' here for ages but had assumed was some sort of fringe view. I don't really care what you guys do with this; you can continue as is and be debunked big time as cranks, your choice. edit: actually, you can see Eliezer himself said that most AI researchers are lunatics. What did SI do to distinguish themselves from what you guys call 'lunatics'? What is here that can shift the probabilities from the priors? Absolutely nothing. The focus on safety against made-up fears is no indication of sanity whatsoever.
You're misusing language, either by not realizing that most people read "members of group A think X" as "a sizable majority of members of group A think X", or by not caring and blaming the reader when they parse it the standard way. We don't say "LWers are religious" or even "US citizens vote Democrat", even though there's certainly more than one religious person on this site or Democrat voter in the US.
And if you did intend to say that, you’re putting words into Manfred’s mouth by assuming he’s talking about ‘all’ instead.
I do think the 'sizable majority' hypothesis has not been ruled out, to say the least. SI is working to help build a benevolent ruler bot to save the world from a malevolent one. That sounds about as crazy as things can get. Prior track record of doing anything relevant? None. Reasons for SI to think it can make any progress? None.
I think most sceptically minded people do see that kind of stuff in a pretty negative light, but of course that's my opinion; you can disagree. Actually, who cares: SI should just go on, 'fix' what Holden pointed out, increase its visibility, and get listed on crackpot/pseudoscience pages.
I'm not talking about SI (which I've never donated money to); I'm talking about you. And you're starting to repeat yourself.
I can talk about you too. The statement "That's why outsiders think it's a circlejerk" does not have a 'sizable majority', 'significant minority', 'all', or 'some' qualifier, nor does it have any implied qualifier, nor does it need to be qualified with a vague "some"; that would be needless verbosity, since 'some' can range from 0.00001% to 99.999%, and the request to add "some" is clearly rhetorical, which we both realize equally well. (It is the case, though, that I think the most likely reading is "a significant majority of rational people", i.e. I expect a greater than 50% chance of a strong negative opinion of SI if it is presented to a rational person.)
The other day someone told me my argument was shifting like the wind.
Does that mean it is time to stop feeding him?
I had decided when I finished my hiatus recently that the account in question had already crossed the threshold beyond which I could no longer reply to him without predicting that I was just causing more noise.
Good point.