Article about LW: Faith, Hope, and Singularity: Entering the Matrix with New York’s Futurist Set
Faith, Hope, and Singularity: Entering the Matrix with New York’s Futurist Set
To my knowledge LessWrong hasn’t received a great deal of media coverage, so I was surprised when I came across, via a Facebook friend, an article that also appeared on the cover of today’s New York Observer. I was disappointed upon reading it, however, as I don’t think it is an accurate reflection of the community. It certainly doesn’t reflect my experience with the LW communities in Toronto and Waterloo.
I thought it would be interesting to see what the broader LessWrong community thought about this article. I think it would make for a good discussion.
Possible conversation topics:
This article will likely reach many people that have never heard of LessWrong before. Is this a good introduction to LessWrong for those people?
Does this article give an accurate characterization of the LessWrong community?
Edit 1: Added some clarification about my view on the article.
Edit 2: Re-added link using “nofollow” attribute.
There are limited categories for groups to be placed in by the media: we scored ‘risque’ instead of ‘nutjob’, so this piece is a victory, I’d say.
I know that this article is more than a bit sensationalized, but it covers most of the things that I donate to the SIAI in spite of, like several members’ evangelical polyamory. Such things don’t help the phyg pattern matching, which already hits us hard.
The “evangelical polyamory” seems like an example of where Rationalists aren’t being particularly rational.
In order to get widespread adoption of your main (more important) ideas, it seems like a good idea to me to keep your other, possibly alienating, ideas private.
Being the champion of a cause sometimes necessitates personal sacrifice beyond just hard work.
Probably another example: calling themselves “Rationalists”
Yeah.
Seriously, why should anyone think that SI is anything more than “narcissistic dilettantes who think they need to teach their awesome big picture ideas to the mere technicians that are creating the future”, to paraphrase one of my friends?
This is pretty damn illuminating:
http://lesswrong.com/lw/9gy/the_singularity_institutes_arrogance_problem/5p6a
Re: the sex life, there’s nothing wrong with it per se, but consider that there are things like the psychopathy checklist, where you score points for basically talking people into giving you money, for being admired beyond your accomplishments, and also for sexual promiscuity. On top of that, most people will give you a fuzzy psychopathy point for believing the AI would be psychopathic, because of the typical mind fallacy. I’m not saying this is solid science (it isn’t), just outlining how many people think.
This doesn’t seem to happen when people note that corporations, viewed as intentional agents, behave like human psychopaths. The reasoning is even pretty similar to the case for AIs: corporations exhibit basic rational behavior but mostly lack whatever special sauce individual humans have that makes them a bit more prosocial.
Well, the intelligence in general can be much more alien than this.
Consider an AI that, given any mathematical model of a system and some ‘value’ metric, finds optimum parameters for an object in that system. E.g. the system could be the Navier-Stokes equations and a wing: the wing shape would be the parameter, and some metric of the wing’s drag and lift would be the value to maximize, with the AI doing everything necessary, including figuring out how to simulate those equations efficiently.
Or the system could be general relativity and quantum mechanics, the parameter could be a theory of everything equation, and some metric of inelegance has to be minimized.
That’s the sort of thing that scientists tend to see as ‘intelligent’.
The AI, however, did acquire plenty of connotations from science fiction, whereby it is very anthropomorphic.
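In code terms, the kind of narrow optimizer described above might look something like this minimal sketch; the stochastic hill-climbing strategy and the toy wing objective are invented for illustration (a real version would simulate the actual equations):

    import random

    def optimize(value, initial_params, n_iter=10000, step=0.1):
        """Black-box optimizer: given a value metric over parameters,
        search for parameters that maximize it (simple stochastic hill climb)."""
        best = list(initial_params)
        best_value = value(best)
        for _ in range(n_iter):
            candidate = [p + random.gauss(0, step) for p in best]
            v = value(candidate)
            if v > best_value:  # keep any improvement
                best, best_value = candidate, v
        return best, best_value

    # Toy stand-in for "lift minus drag" of a wing described by two numbers.
    def wing_value(params):
        camber, thickness = params
        lift = camber - camber ** 3    # diminishing returns on camber
        drag = 0.5 * thickness ** 2    # thicker wing, more drag
        return lift - drag

    best_params, best_score = optimize(wing_value, [0.0, 0.5])
    print(best_params, best_score)

Nothing in such an optimizer refers to the outside world at all; it just searches a parameter space.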
Those are narrow AIs. Their behavior doesn’t involve acquiring resources from the outside world and autonomously developing better ways to do that. That’s the part that might lead to psychopath-like behavior.
Specializing the algorithm to the outside world and to a particular philosophy of value does not make it broader, or more intelligent, only more anthropomorphic (and less useful, if you don’t believe in friendliness).
The end value is still doing the best possible optimization of the parameters of the mathematical system. There are many more resources available for that in the outside world than whatever the algorithm has when it starts up. So an algorithm that can interact effectively with the outside world may be able to satisfy whatever alien goal it has much better than one that can’t.
(I’m a bit confused if you want the Omohundro Basic AI Drives stuff explained to you here or if you want to be disagreeing with it.)
Having the specific hardware that computes an algorithm actually display the results of the computation within a specific time is outside the scope of a ‘mathematical system’.
Furthermore, the decision theories are all built to be processed using the above-mentioned mathematics-solving intelligence to attain real-world goals, except that defining real-world goals proves immensely difficult. edit: Also, if the mathematics-solving intelligence had some basic extra drives, such as resisting being switched off (so that it could complete its computations), then an FAI relying on such a mathematics-solving subcomponent would be impossible. The decision theories presume the absence of any such drives inside their mathematics-processing component.
If sufficiently advanced technology is indistinguishable from magic, then arguments about a “sufficiently advanced AI system”, in the absence of an actual definition of what it is, are indistinguishable from magical thinking.
That sentence is magical thinking. You’re equating the meaning of the word “magic” in Clarke’s Law and in the expression “magical thinking”, which do not refer to the same thing.
I thought the expression ‘magical thinking’ was broad enough to include fantasizing about magic. I do think, though, that even in the sense of ‘thinking by word association’ it happens a whole lot in futurism, where the field is ill-specified and collisions between model and world are commonplace (along with general confusion due to the lack of specificity of the terms).
Ok, then, so the actual problem is that the people who worry about AIs that behave psychopathically have such a capable definition of AI that you consider them to be basically speaking nonsense?
The “sufficiently advanced” in their arguments means “sufficiently advanced in the direction of making my argument true” and nothing more.
If I adopt a pragmatic version of “advancedness”, then software (algorithms) that is somehow magically made to* self-identify with its computing substrate is less advanced, unless it is also friendly or something.
* We don’t know how to do that yet. edit: And some believe that it would just fall out of general smartness somehow, but I’m quite dubious about that.
“evangelical polyamory”
Very much agree with this in particular.
Who’s being evangelical about it?
Maybe the word “evangelical” isn’t strictly correct. (A quick Google search suggests that I had cached the phrase from this discussion.) I’d like to point out an example of an incident that leaves a bad taste in my mouth.
This comment was made by Eliezer, in the name of this community, in the author’s notes to one of LessWrong’s largest recruiting tools. I remember when I first read this, I kind of flipped out. Professor Quirrell wouldn’t have written this, I thought. It was needlessly antagonistic, it squandered a bunch of positive affect, there was little to be gained from this digression, it was blatant signaling—it was so obviously the wrong thing to do and yet it was published anyway.
A few months before that was written, I had cut a fairly substantial cheque to the Singularity Institute. I want to purchase AI risk reduction, not fund a phyg. Blocks of text like the above do not make me feel comfortable that I am doing the former and not the latter. I am not alone here.
Back when I only lurked here and saw the first PUA fights, I was in favor of the PUA discussion ban: if LessWrong wants to be a movement that either raises the sanity waterline or maximizes the probability of solving the Friendly AI problem, it needs to be as inclusive as possible and have as few ugh fields as possible that immediately drive away new members. I now think an outright ban would do more harm than good, but the ugh field remains and is counterproductive.
[d1]: http://lesswrong.com/lw/9kf/ive_had_it_with_those_dark_rumours_about_our/5raj
When you decide to fund research, what are your requirements for researchers’ personal lives? Is the problem that his sex life is unusual, or that he talks about it?
My biggest problem is more that he talks about it, sometimes in semiofficial channels. This doesn’t mean that I wouldn’t be squicked out if I learned about it, but I wouldn’t see it as a political problem for the SIAI.
The SIAI isn’t some random research think tank: it presents itself as the charity with the highest utility per marginal dollar. Likewise, Eliezer Yudkowsky isn’t some random anonymous researcher: he is the public face of the SIAI. His actions and public behavior reflect on the SIAI whether or not it’s fair, and everyone involved should have already had that as a strongly held prior.
If people ignore LessWrong or don’t donate to the SIAI because they’re filtered out by squickish feelings, that means less resources for the SIAI’s mission in return for inconsequential short-term gains realized mostly by SIAI insiders. Compound this with the fact that talking about the singularity already triggers some people’s absurdity bias; there need to be as few other filters as possible to maximize the usable resources the SIAI has for maximizing the chance of positive singularity outcomes.
It seems there are two problems: you trust SIAI less, and you worry that others will trust it less. I understand the reason for the second worry, but not the first. Is it that you worry your investment will become worth less because others won’t want to fund SIAI?
That talk was very strong evidence that SI is incompetent at PR and, furthermore, irrational. edit: Or doesn’t actually hold its stated goals and beliefs. If you believe the donations are important for saving your life (along with everyone else’s), then you naturally try to avoid making such statements. Though I do in some way admire straight-up, in-your-face honesty.
My feelings on the topic are similar to iceman’s, though possibly for slightly different reasons.
What bothers me is not the fact that Eliezer’s sex life is “unusual”, or that he talks about it, but that he talks about it in his capacity as the chief figurehead and PR representative for his organization. This signals a certain lack of focus due to an inability to distinguish one’s personal and professional life.
Unless the precise number and configuration of Eliezer’s significant others is directly applicable to AI risk reduction, there’s simply no need to discuss it in his official capacity. It’s unprofessional and distracting.
(in the interests of full disclosure, I should mention that I am not planning on donating to SIAI any time soon, so my points above are more or less academic).
On the other hand—while I’m also worried about other people’s reaction to that comment, my own reaction was positive. Which suggests there might be other people with positive reactions to it.
I think I like having a community leader who doesn’t come across as though everything he says is carefully tailored to not offend people who might be useful; and occasionally offending such people is one way to signal being such a leader.
I also worry that Eliezer having to filter comments like this would make writing less fun for him; and if that made him write less, it might be worse than offending people.
I can only give you one upvote, so please take my comment as a second.
Agreed. I don’t want to have to hedge my exposure to crazy social experiments; I want pure-play Xrisk reduction.
For a little historical perspective, here are some examples of journalistic coverage of Extropians in the 1990s: June 1993. October 1994. March 1995. April 1995. Also 1995.
Transhumanism seems to hold people’s interests until around the time they turn 40, when the middle-aged reality principle (aging and mortality) starts to assert itself. It resembles going through a goth phase as a teenager.
BTW, I find it interesting that Peter Thiel’s name keeps coming up in connection with stories about the singularity, as in the New York Observer one, when he has gone out of his way lately to argue that technological progress has stagnated. Thiel has basically staked out an antisingularity position.
That position is “antisingularity” only in the Kurzweilian sense of the word. I wouldn’t be surprised if e.g. essentially everyone at the Singularity Institute were “antisingularity” in this sense.
Yep.
Saying that we haven’t made much progress recently isn’t the same as not wanting a positive singularity event. These are orthogonal. Thiel has directly supported singularity-related organizations and events, while also being pessimistic about our technological progress. These are most certainly related.
I think this article is exceptionally nice for a hit piece on us by a gossip rag. Take it for what it is, let it go, give up on it, and don’t waste your time debating it.
Just read the article. I thought it was very nice! It takes us seriously, it accurately summarizes many of the things that LWers are doing and/or hope to do, and it makes us sound like we’re having a lot of fun while thinking about topics that might be socially useful while not hurting or threatening anyone. How could this possibly be described as trolling? I think the OP should put the link back up—the Observer deserves as much traffic as we can muster, I’d say.
At my middle school there was a sweet kid who probably had pretty serious Asperger’s. He was teased quite a bit, but it would often take him a while to figure out that the other kids were sarcastically mocking him rather than being friendly. He’d pick up on it eventually, but by then he had replied the wrong way and looked stupid, leading to even more teasing.
No offense, but if you thought the article was taking us seriously you are somewhat socially tone-deaf.
I wouldn’t say it was taking us seriously, but journalists of this type tend not to take anything “seriously”. Only “hard-news” journalists write in a style that suggests their subjects are of status equal to or higher than their own.
I think many are failing to appreciate just how much respect is shown by the fact that almost nothing in the piece is false. That’s an incredible achievement for a fluff journalist writing about...pretty much anything, let alone this kind of subject matter.
The Observer isn’t the Times… but it also isn’t the Enquirer or World Net Daily. But your point is taken. Still, while you’re going to get the “wow, these people sure are weird” reaction no matter what, what you want is a “…but maybe they have a point” graf, or at least not to get called “unhinged”. I don’t really have anything against the writer—she does what she does well (the writing is really excellent, I think). And I do think she probably likes the Less Wrong crowd she met. But I think it made the image problem really clear and explicit.
No offense taken! I was that kid in middle school, but I’ve grown a lot since then. I’ve learned to read people very well, and as a result I’ve been able to win elections in school clubs, join a fraternity, date, host dinner parties, and basically have a social life that’s as active and healthy as anyone else’s.
I think often we assume that people are criticizing us because we are starting out from a place of insecurity. If you suspect and worry and fear that you deserve criticism, then even a neutral description of your characteristics can feel like a harsh personal attack. It’s hard to listen to someone describe you, just like it’s hard to listen to an audiotape of your voice or a videotape of your face. We are all more awkward in real life than we imagine ourselves to be; this is just the corollary of overconfidence/optimism bias, which says that we predict better results for ourselves than we are likely to actually obtain. It’s OK, though. Honest, neutral feedback can be uncomfortable to hear, and still not be meant as criticism, much less as trolling.
Are there thousands of narrow-minded people who will read the article and laugh and say, “Haha, those stupid Less Wrongers, they’re such weirdos?” Of course. But I don’t think you can blame the journalist for that—it’s not the journalist’s job to deprive ignorant, judgy readers of any and all ammunition, and, after all, we are a bit strange. If we weren’t any different from the mainstream, then why bother?
I’m not blaming the journalist. The problem is that the image that was projected (and I’m not close enough to the situation to be comfortable attributing any blame, thus the passive voice) wasn’t worth taking seriously.
In the article’s comments, the author states that she “found the people [she] met and talked to charming, intelligent, and kind.”
Which a) is a perspective that could have shown through a bit more in the article and b) is entirely independent of whether or not she or the article takes Less Wrong or SI seriously.
But I did read that earlier and mellowed a bit. Again, I don’t fault a gossip writer for writing gossip. That’s a separate question from whether or not the story counts as good press.
Writers who get articles published in popular magazines tend to write the things that people who read popular magazines want to read. This may or may not be identical to the actual beliefs or feelings of the author.
It is also not an accurate depiction of the community in London or Edinburgh (UK). However, I think it is pretty close to exactly what I would expect a tabloid summary of the Berkeley community to look like, based on my personal experience. The communities in Berkeley and NY really are massively different in kind to those pretty much anywhere else in the world (again, from personal experience).
And, as Kevin says, it is remarkably nice—they could have used exactly the same content to write a much more damning piece.
The welcome thread indicates at least one person has joined because of the article.
Some people would say that if a New York City gossip rag thinks that expressing opinions about your sex life will sell magazines, that means you’ve “arrived” or some such. Still, it can’t be particularly comfortable for the people named. :-(
Saying Eliezer has (or had) an “IQ of 143” is a bit silly—to be blunt: who cares? Maybe it was contextualized in some way and then the context got edited out? Dunno. By comparison, characterizing him as messianic is down-to-earth and relevant :D
And boy howdy, this gal was interested in our sex lives.
Girl. But yes.
Thanks, fix’d.
Wouldn’t want to misrepresent anything now.
I’m really glad that the line about EY’s IQ links to the video in which EY makes that claim—I can’t conceive that anyone could watch that video all the way through and come away with the impression that EY is a phyg leader.
She could have used this source material.
About that Communications Director...
Though it’s possible the reporter has twisted your words more than I suspect, I’ll say:
Wow, some of the people involved really suck at thinking (or caring to think) about how they make the scene look. I think I’m able to correct pretty well for the discrepancy between what’s reported and the reality behind it, but even after the correction, this window into what the scene has become has further lowered my interest in flying over to the States to hang out with you, since it seems I might end up banging my head against the wall in frustration at all the silliness that’s required for this sort of reporting to get its source material.
(Though I do also think that it’s inevitable that once the scene has grown to be large and successful enough, typical members will be sufficiently ordinary human beings that I’d find their company very frustrating. Sorry, I’m a dick that way, and in a sense my negative reaction is only a sign of success, though I didn’t expect quite this level of success to be reached yet.)
(By the previous I do not, however, mean to imply that things would have been saner 10 years ago (I certainly had significant shortcomings of my own), but back when nobody had figured much of anything out yet or written Sequences about stuff, the expected level of insanity would have been partly higher for such reasons.)
This was a private party announced via a semi-public list. A reporter showed up and she talked to people without telling them she was a reporter. This is not a report, it is a tabloid piece. Intentional gossip.
Or, contrariwise, scandal-sheet reporters are good at making people look scandalous?
(Don’t think of a beautiful blue beetle.)
My experience with the NY Less Wrong group, of which I had been a part, is that we are, indeed, a bunch of silly people who like to do things that are silly, such as cuddle-piling, because they’re fun to do and we don’t care that much about appearing dignified in front of each other. If silliness bothers you, then you might very well be right in concluding that you wouldn’t enjoy hanging out with them in person.
D’you think? You’ll understand better after being reported-on yourself; and then you’ll look back and laugh about how very, very naive that comment was. It’s the average person’s incomprehension of reporter-distorting that gives reporters their power. If you read something and ask, “Hm, I wonder what the truth was that generated this piece?” without having personal, direct experience of how very bad it is, they win.
I think the winning move is to read blogs by smart people, who usually don’t lie, rather than anything in newspapers.
Actually, I feel that I have sufficient experience of being reported on (including in an unpleasant way), and it is precisely that which (along with my independent knowledge of many of the people getting reported on here) gave me the confidence to suspect that I would have managed to separate from the distortions an amount of information that described reality.
That said, there is a bit of fail with regard to whether I managed to communicate what precisely impacted me. Much of it is subtle, necessarily, since it had to be picked up through the distortion field, and I do allow for the possibility that I misread, but I continue to think that I’m much better at correcting for the distortion field than most people.
One thing I didn’t realize, however, is that you folks apparently didn’t think the gal might be a reporter. That’s of course a fail in itself, but certainly a lesser fail than behaving similarly in the presence of a person one does manage to suspect to be a reporter.
Just for fun, here’s my vilification at the hands of the tabloid press. Naturally the majority of it is rubbish. It’s striking how they write as if they hadn’t spoken to us, when we actually spoke to them at length. For one thing, they could have asked us if we were students—we weren’t...
That is just blatant. It’s like a parody of bad journalism.
Today I went to show this to a friend. I remembered reading a more detailed version of the story somewhere and after some searching I found the copy hosted by the good folks at archive.org, which I’m posting here for reference: “How To Be Notorious or Attack of the Tripehounds”
Oh that’s great! Thank you arundelo, thank you Wayback Machine!
What sample size are you generalizing from?
My personal experience is that I have been reported on in a personal capacity zero times. I’ve had family members in small human-interest stories twice that I recall off hand. I’ve read stories about companies I worked for and had detailed knowledge of the material being reported on several times; I don’t have an exact number.
My experience with those things does not line up with yours. I conclude from this that the normal variance of reporting quality is higher than either of us has personal experience with.
Data point: I was reported-on three times, by a serious newspaper. Most information was wrong or completely made up. Luckily, once they forgot to write my name, and once they wrote it wrong, so it was easier for me to pretend that those two articles were not about me.
(I’m assuming that your complaint is about the interview quality on LW topics, rather than the physical intimacy, which we can assume is present but was amplified in the writing process. Honestly there are several things I think your comment could be about, so fortunately my problems with it are general)
I think this comment is uncharitable. Which you kind of knew already. And which, by itself, isn’t so bad.
But unfortunately, you fall into the fundamental attribution error here, and explain other people’s failings as if they were inherent properties of those people—and not only do you mentally assign people qualities like “sucks at explaining,” you generalize this to judge them as whole people. Not only is this a hasty conclusion, but you’re going to make other people feel bad, because people generally don’t like being judged.
I can understand the impulse “I would have done much better,” but I would much rather you kept things constructive.
The starting point for my attitude was people doing things like intervening in front of a reporter to stop discussion of a topic that looks scandalous, or talking about Singularity/AI topics in a way that doesn’t communicate much wisdom at all.
Being silly with regard to physical intimacy and in general having a wild party is all well and good by itself, if you’re into that sort of thing, but I react negatively when that silliness seems to spill over into affecting the way serious things are handled.
(I’ll partly excuse being light on constructiveness by having seen some copy-pastes that seem to indicate that what I’m concerned about is already being tackled in a constructive way on the NYC mailing list. The folks over there are much better positioned to do the constructive things that should be done, and I wasn’t going to try to duplicate their efforts.)
Eliezer’s OkCupid profile (as included in the titular link) provides some really good dating advice!
Here’s some dating advice: Don’t use the sentence “you shouldn’t worry about disqualifying yourself or thinking that I’m not accessible to you” anytime anywhere, let alone on your dating profile. Some career advice: If your day job supposedly involves ethics, you should probably tone down the publicly-available dating profile where you advertise yourself as a polyamorous sadist who welcomes the casual advances of women who “want to sleep with me once so you can tell your grandchildren” (provided they don’t “disqualify” themselves by thinking you’re not “accessible”, I suppose).
I’m hoping the whole thing is tongue-in-cheek...? (If so, it’s merely the product of poor judgment, rather than terrifying.)
If Eliezer ever does a complete reversal of his ethical position and starts advocating the 3^^^^3 dust-specks over the quantitatively negligible torture because “I mean wow doesn’t that just turn you on?” I’ll start to be concerned.
Career advice, simplified: If your day job requires having a good image, you should care about having a good image.
(Note: It’s not about ethics, only about perceptions. But the perceptions are important.)
Feynman used to hang in topless bars, didn’t he?
Did he also believe that he increases the existential risk by doing so?
Good point.
It seems to me that EY does not need your dating advice.
Why is it bad ethics? And why is it bad for EY’s career? He does not seem to be interested in soliciting donations from social conservatives.
It’s not so much the content as the presentation. The tone is incredibly self-absorbed and condescending. I thought the whole thing was a joke until I encountered the above quoted paragraph with its apparent sincerity. Presumably some of the content is intended to be tongue-in-cheek and some of it posturing, but it’s difficult to separate. There’s a compounding weirdness to the whole thing. Fetishes or open relationships or whatever aren’t in themselves causes for concern, but when somebody is trying to advocate for rationalism and a particular approach to ethics, the sense that you’re following them somewhere very strange isn’t good to have.
Let me try to make that clearer: Utilitarianism already has the problem of frequently sounding as if sociopaths are discussing ethics as something entirely abstract. Applying that to relationships, in the form of evangelical polyamory, takes it to another level of squeamishness (as others here have indicated). Seeing those ideas put into practice in the context of the dating profile of a self-professed sadist (who has been accused of wanting to take over the world, no less), replete with technical terminology (“primary”, “dance card”, etc), condescending advice to prospective conquests to help them overcome their fear of rejection and a general tone of callousness, sends it over the edge. Read straight, the profile could almost serve as a reductio for SIAI-brand ethics and rationality.
I’m also worried about who the intended audience is. Since I can’t imagine anyone not deeply immersed in the Less Wrong community responding positively to it, I was left with the sense that perhaps our community’s figurehead is (ab)using his position in ways that, as someone else put it, “don’t help the phyg pattern matching.” It’s basically an advertisement saying, “I’m a leader in small community x and I’m open to your sexual advances, so don’t be shy.”
And the problem with this is what, exactly? AFAIK, that’s simply the male equivalent to a cleavage photo.
This bit is quite similar to the rest of your comment: a denotative description with negative connotation, but lacking in any explanation for the connotation applied.
More precisely your criticism appears to all be of the form, “this is weird, and weird is bad.” There isn’t any explanation of “bad”, not even bad for whom or what goals, let alone how it is expected to be bad.
Less Wrong is already weird enough without the blatant weirdness in EY’s OKCupid profile. I’m seriously disappointed and worried by the fact that it’s still public, to be honest...
I think we’re all committing the typical mind fallacy by assuming that random other people are like us in that they’ll actually evaluate the ideas behind something instead of just superficially judging the people describing the ideas. Yes, we should try to get people to actually evaluate ideas as much as possible, but we should also try to appear as normal as possible for people who don’t instinctively actually evaluate ideas. See http://www.overcomingbias.com/2012/01/dear-young-eccentric.html
As far as I can tell, a large part of the reason PR departments exist in the first place is to control superficial impressions. I think this sends a bad superficial impression (and possibly even a worrisome non-superficial impression, i.e. on reflection maybe we don’t want to have someone who would write what EY wrote as a high-status figure in the aspiring rationalist community).
The latter is a somewhat stronger signal in as much as it is hard to fake. You have to have cleavage if you wish to show it off in a crudely overt way. Writing that you have status requires nothing.
Push-up bras. Photoshop. Or even uploading a picture of someone else.
I can’t imagine someone with an IQ of 90 able to come up with what EY wrote. Even the lack of spelling or grammar errors would be unusual for such a person. And his position within SIAI is easily googleable.
Pjeby was referring to a specific, fairly simple sentence. The most complex part was the single comma. The sentence is rather less impressive than even moderately endowed cleavage displays.
I agree that the overall profile is a strong signal. If I recall correctly I described it in a cousin comment as an approximately optimal combination of signalling and screening given Eliezer’s strengths and weaknesses. Someone else attempting to convey the same message would require non-trivial amounts of intelligence and an awful lot of familiarity with Eliezer’s culture.
As someone who had read Eliezer’s OkCupid profile sometime not very recently, I was actually gonna reply to this with something like “well, scientism goes maybe a bit too far, but he does actually have a point”
...but then I just went and reread the OkCupid profile, and no, actually it’s wonderfully funny and I have no worries similar to scientism’s, unlike earlier when the profile didn’t explicitly mention sadism.
Obviously Eliezer is a very unusual and “weird” person, but the openness about it that we observe here is a winning move, unlike the case where one might sense that he might be hiding something. Dishonesty and secrecy is what the evil phyg leaders would go for, whereas Eliezer’s openness invites scrutiny and allows him to emerge from it without the scrutinizers having found incriminating evidence.
Also, where are you seeing evangelical polyamory? I’m very much not polyamorous myself, and haven’t ever felt that anyone around here would be pushing polyamory to me.
I was going to post: “What makes it evangelical polyamory as opposed to just plain old polyamory?”
It seems to me the “evangelical” part was just added to make it seem worse without actually giving any valid reasons.
I think there’s a strong effect wherein “open non-ashamed polyamory wherein you mention any positive reason why you like it” = “evangelical polyamory”, or even just “open polyamory” = “evangelical polyamory”, for the same reasons as “evangelical atheism” and “evangelical homosexuality”.
As hard as you have tried to misrepresent the profile through cherrypicking it still doesn’t sound so bad. Eliezer’s profile is actually a close-to-optimal combination of signalling and screening for someone of Eliezer’s strengths and weaknesses. His advice is good, yours is bad—or at least naive and poorly generalised from advice that would be useful for PUA amateurs with a very specific persona and target audience in mind.
An excellent conclusion. I almost quoted that too (but chose to emphasize the advice part instead.)
Terrifying? I don’t believe you. I believe this was just a word that sprung to mind when you searched for “word with negative connotations that I can use to attempt to discredit Eliezer”. Pjeby’s commentary of your reply seems spot on.
I suspect that a lot the disagreement in this thread actually stems from what sets off peoples’ “squick” reflexes and how strong the reaction is in different individuals. It seems like you and pjeby don’t get a strong “squick” reaction from what Eliezer wrote on his profile, whereas scientism does. Compare scientism’s and pjeby’s reactions—scientism calls the profile “a new level of squeamishness,” where pjeby says that this description is “lacking in any explanation for the connotation applied.” To people like scientism, it feels obvious that this kind of squickiness is just bad and ugly-looking, but to yourself and pjeby, it doesn’t seem so apparent.
Stepping down from the meta-level and returning to the original point: I don’t think “terrifying” is necessarily hyperbole. Some people do actually react so strongly to squick that it makes them physically uncomfortable, and uncomfortableness (to whatever degree) is probably what motivated the arguments scientism made, especially the ones that you consider harsh or misrepresentational. (Note that this isn’t a defense of those arguments, just speculation about their origin and why you don’t agree with them.)
To be clear: I haven’t actually read the profile, only the excerpts posted here. But I’m quite confused as to why Eliezer openly stating an interest in sadism or polyamory would be a problem in any event.
Rationally speaking, the best way to find a partner with matching preferences is to be open and upfront about what it is your preferences are, just in case your potential partner(s) aren’t being upfront enough for you to find them.
What’s more, societal double standards being what they are, it’s generally less costly for a male to state his preferences up front than for a female… not to mention that it’s time-saving for all the females who don’t share his preferences. Frankly, being as honest and upfront as possible is an altruistic and highly ethical stance to take, because it benefits all of the women who view a person’s profile with an eye to dating its author.
And that is why I’m so utterly baffled by the mudslinging that seems to imply it’s, um, unethical? or something.
Edited to add: Just read the actual profile, and I am now updated even more in the direction that I have no clue WTF people are thinking. The sexual bits seem pure and innocent as the driven snow (at least to my own corrupt mind), and even the excessive citation of other people’s finding him impressive came off more as insecurity than arrogance. WTF are people complaining about, besides, “some people won’t like it”? “Some people won’t like it” is a fully general counterargument against doing or saying anything, anywhere, anytime, ever.
I think you’re barking up the wrong tree by engaging the specific arguments scientism is making rather than looking at what motivated these arguments in the first place. See this comment by scientism:
I don’t think this is actually about Eliezer’s preferences being unethical or anything like that, and it’s certainly not about broader things like the optimal way of finding a partner with matching preferences. This is about a subset of the population reading Eliezer’s profile, thinking oh god this is weird ew ew get it away from me ick ick ick, and then writing these kinds of arguments as a reaction. This might sound like I’m being uncharitable to scientism, accusing him/her of giving nontrue rejections, but I think it’s accurate because a) the comment I quoted above indicates that this is largely about squick rather than ethics, and b) I also experienced a very strong squick reaction upon reading Eliezer’s profile, so I’m somewhat sympathetic to freaking out about it.
I think I’ve made it clear that I don’t find offence in any of the particular lifestyle choices expressed in the profile (i.e., sadomasochism and polyamory), but I think it’s more than an issue of mere presentation or the squick factor. My point is that the profile offers some insight into where following LW/SIAI/CFAR recommendations might take you. When somebody sets themselves up as an ethics and rationality expert their own lifestyle and character are going to be subject to especial scrutiny and rightly so. That isn’t to say that people should be alarmed at sadomasochism or polyamory; what I tried to convey was that everything together—the quirks, the presentation, the personality—painted a picture of something altogether bizarre. That combined with the fact that this person is offering advice on how to live your life was the source of potential terror.
Based on your previous comment, I had guessed that you were squicked out by the presentation rather than Eliezer’s actual lifestyle choices; thank you for clarifying. As I indicated above, I had a similar emotional reaction to the presentation.
I’m curious as to what underlying psychological factors caused us to react this way, and what subset of the population would also feel this kind of squick.
I guess what I want to emphasise is that I don’t think the reaction is illicit or even particularly subjective. One of the ways a system of ethics can fail is that it’s impoverished. It doesn’t capture everything we want to say about the subject. When you encounter a person or group who are living their life according to a particular ethical system and you have the sense of things spiralling away from normalcy, that’s a legitimate cause for concern. It’s a sense that something might be missing here. That’s why I said it could almost serve as a reductio. It’s similar to performing a long calculation and being left with the sense that the answer is an order of magnitude out.
To me, what society considers “normal” is terribly unethical, so “spiraling away from normalcy” isn’t a cause for concern; it piques my curiosity.
“Maybe he’s on to something...”
Imagine replacing the polyamory with homosexuality, and imagine it is a few decades ago when homosexuality was as risque as polyamory is currently. Do you have the same reaction? If not, what is different? If so, do you condone that reaction?
There’s a historical parallel there. In the earlier 20th century, the followers of G. E. Moore’s system of ethics were alleged to have had non-standard relationships and to have practiced “evangelical” homosexuality. No doubt they were right to challenge the social mores of their day, but I also think it would be fair to say that their lifestyles, taken together, signalled an impoverished ethical system (in this case, one dedicated to aesthetic pleasure). Obviously you can have good and bad reasons for doing anything. I’ve seen posts on LW about “polyhacking” (ridding oneself of sexual jealousy) and intentionally opening oneself up to same-sex relationships. I take no issue with any of this, except that people might be doing these things for bad reasons, and if somebody is engaged in a lot of this kind of thing it can be reason to ask whether their goals got confused somewhere along the way.
Agreed.
I would also say the same thing about someone who spends a lot of time trying to conform to mainstream sexual or relationship norms.
Of course, figuring out what my society wants from me (sexually, romantically, or in any other area) and arranging my life so I provide it isn’t necessarily problematic, any more than figuring out what I enjoy (ibid) and arranging my life to provide me with more of it is. But if I’m doing either to the significant exclusion of pursuing other things I value, I’ve gotten off track.
That said, I’ve noticed lots of people tend to notice (or at least point out) that truth differentially when the derailing force is a non-mainstream activity.
You seem to be saying that the fact that “a picture of something altogether bizarre” was painted has something particular to do with the LW community — that there is something that the LW community could have covered up, or done differently, that would have prevented that picture from being painted.
But the writer in question is in the business of gossip-mongering: providing entertainment in the form of bizarre pictures of social groups. This is not a truth-tracking endeavor. An effective gossip-monger can find something kinky and kooky about any group deemed sufficiently important to write about. Moreover, hiding your bi-poly-switch-trans-cuddle-nudist tendencies is not effective against gossip-mongers: if they can’t call you an oversexed pervert, they will call you a sexually-repressed virgin who can’t get laid.
I assumed the reasoning went “A preference for sexually masochistic mates is inherently evil. Eliezer expressed such a preference therefore he is confessing to be unethical.”
I really don’t think this is what scientism is actually arguing—what makes you think that? Also, see my reply to pjeby—I see this as being about tone rather than moral arguments.
Because that is the actual implied meaning of the argument. (Substitute “accepted by all relevant parties to be” for “inherently” if you prefer.) Of course he didn’t make it explicit and instead kept it in the realm of connotation. That’s what you are supposed to do when moralizing—especially when your moralizing makes no sense.
If you reject the above as the intended argument then all you achieve is changing the interpretation from “coherent argument based on ridiculous premises” to “no argument whatsoever”. Hardly an improvement.
Pjeby similarly described scientism’s comment as being devoid of anything but negative tone.
I had intended to shift the discussion to the emotional reaction that created the argument. If a subset of the population responds to certain things with such a strong emotional reaction, then this may be worth talking about, even if the arguments scientism used when expressing this emotion aren’t.
I agree with everything else you said.
I agree with what you say here too.
Yes, it was actually shockingly good presentation, although I guess I shouldn’t be surprised after reading HPMOR.
According to OkC, EYudkowsky would be an 86% match with me. Wow. (But we’re both straight guys and we’re 10,035 km apart.)
Match percentage ignores gender/distance. It’s entirely based on how you answer questions and weight them. Eliezer is a 99 percent match with me.
Yes, I know. (BTW, much (most?) of that missing 14% is due to our different attitudes toward alcohol.)
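(For the curious: OkCupid publicly described the match percentage as the geometric mean of each user’s weighted satisfaction with the other’s answers. Here is a rough sketch of that calculation; the weight values and data layout are illustrative, not OkCupid’s actual code.)

    from math import sqrt

    # Illustrative importance weights; OkCupid published similar values.
    WEIGHTS = {"irrelevant": 0, "a little": 1, "somewhat": 10,
               "very": 50, "mandatory": 250}

    def satisfaction(my_prefs, their_answers):
        """Fraction of my possible points that the other person's answers earn.
        my_prefs maps question -> (set of acceptable answers, importance)."""
        earned = possible = 0
        for question, (acceptable, importance) in my_prefs.items():
            weight = WEIGHTS[importance]
            possible += weight
            if their_answers.get(question) in acceptable:
                earned += weight
        return earned / possible if possible else 0.0

    def match_percent(a_prefs, a_answers, b_prefs, b_answers):
        # Geometric mean of the two one-sided satisfaction scores.
        return 100 * sqrt(satisfaction(a_prefs, b_answers)
                          * satisfaction(b_prefs, a_answers))

    a_prefs = {"Do you like cuddle piles?": ({"Yes"}, "very")}
    a_answers = {"Do you like cuddle piles?": "Yes"}
    print(match_percent(a_prefs, a_answers, a_prefs, a_answers))  # 100.0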
I’m still waiting for some social scientist to write their thesis on us, as Elana Clift did for pickup artists.
The impression I would get from the article is that they are a bunch of weirdo techno-hippies, similar to the derogatory term “Yudkowskians” one occasionally finds online.
Someone in the future is going to read one of the social interaction scenes in a Charles Stross novel and marvel at the eerily accurate depiction...
This community has far more ad hoc strange habits than newcomers suppose. If journalists publish inaccurate descriptions of those habits, then policing their appearance here is justified. But I recall one strategic point among the medium-term goals: public image. If SI doesn’t have the stomach to weather low-quality external coverage, or to be minimally argumentative, then it’s time to be not so public. Maybe LW + HPMOR is sufficient.
Done.
It would be nice if you removed yours now since you aren’t able to use the attribute in your comment.
Thanks!
Actually, comment links are automatically nofollow, with no control by the commenter.
I really hate when writers mock purely through framing and quotations. It just becomes a straw-man/out of context game. If you’re going to editorialize, at least do it openly.
The article is obviously embarrassing to E.Y. If he didn’t want to see this essay’s Google rating improve, it wasn’t because of some general principle regarding “trolling.” That’s a pretty pathetic attempt at an excuse. It was something about this article. But what? Everyone thinks it’s the “moral” aspect. That may be part of his worry: if so, it suggests that the SIAI/Less Wrong complex has a structure of levels—like, say, Scientology—where the behavior of the more “conscious” is hidden from less-conscious followers.
But let me point out a specific revelation, not so prominent in the article but really more important for assessing SIAI and LW.
How do more left-wing members of the SIAI establishment feel about building an organization funded by (to realists, read “controlled by”) an ultra-right-wing billionaire? (It raises questions such as: is the “politics is the mind-killer” trope in place to avoid alienating Mr. Thiel, who would be unimpressed by the anti-libertarianism of a considerable minority on LW?)
E.Y. has built a mystique about himself. Here’s this self-schooled prodigy who has somehow managed to build a massive rationalist community and to preside over a half-million-dollar nonprofit, living the good life of working only four hours per day (per LukeProg) and, in that time, performing only tasks he likes to do, while being paid handsomely. It’s an impressive success story. Even if you don’t think E.Y. is a great philosopher, you have to admire him (at least the way Arnold Schwarzenegger once said he admired Hitler). It does the Yudkowsky myth no service to learn that he had the help of a billionaire who almost single-handedly funded his operations. If I’ve puzzled for years about the secret of E.Y.’s success, now I know it. He has a billionaire friend.
Caveat: Unlike many others here, I don’t like that there are billionaires. They’ve made a mockery of American politics, and their whimsical “charitable” support of intellectual factions will make a mockery of American intellectual life.
We consider this tabloid reporting. I’ve removed the link from the OP because our policy is not to reward trolling (by reporters or anyone else) with publicity or Google rank. OP can add the link back in if they wish and I won’t remove it again, but please keep in mind that we would like to ask politely that you don’t. You should also feel free to delete this post entirely, which you can do by setting the status back to “Draft”.
I disagree with removing the link strongly. It makes you look defensive at best.
Yes, and the oblique suggestion to remove even the mere mention of the newspaper article makes Eliezer look like a) an authoritarian who can’t stomach any media exposure that he does not control, and b) someone who’s never heard of the Streisand Effect.
Oh no, someone told the internet about your polyamorous cuddle-piling cohabiting group of people! Did you not expect those things to get talked about if you achieved any level of fame? Considering the judgement laid down on politicians for hints of inappropriateness, you should either make your relationships more normal and mainstream, or just learn to deal with people attacking you for the weirdness.
To be honest, as a long term supporter of SIAI, this sort of social experimentation seems like a serious political blunder. I personally have no problem with finding new (or not part of current western culture) techniques of… social interactions… if you believe it will make yourself and others ‘better’ for some definition of better.
But if you are serious in actually getting the world behind the movement, this is Bad. “Why should I believe you when you seem to be amoral?”. I have more arguments on this matter but they are easy to generate anyway.
Another thought: one way to think of it might be that to achieve your goals personal sacrifice is necessary and applauded: ‘I’m too busy saving the world to have a girlfriend.’. Perhaps there are better examples than that. Maybe it’s time to get rid of couches?
If we took this argument seriously, we (at least those of us in the United States) would have to pretend to be Christians, too.
Optimizing your life to minimize the worst that Mrs Grundy can say about you is a losing proposition — even if Mrs Grundy is a trendy New York gossip columnist rather than a curtain-twitching busybody neighbor.
http://www.youtube.com/watch?v=tqDFGpd845Y
AFAIK, most of the people in the article are not SI employees, so as a criticism of SI this seems odd. SI can hardly dictate what other people do with their personal lives.
Well, hiding things like this or stopping doing them is possibly even worse as far as image is concerned.
There’s also the issue of demonstrating rationality. If they claim that being rational will change your life and make you happier, but seem to live exactly like everyone else, then their claims hold less force than if being more rational makes you do seemingly weird things. There are arguments to be made that making an effort to tone down the weirdness is counter to the goal of promoting radically different modes of thought than most people are used to.
I’m pro-poly, pro-cuddlepiles, and pro-cohabiting, I just think it’s silly to do all these things and then act shocked when someone else points out that they are weird.
Let’s be honest about ‘demonstrating rationality’ here. If your goals are to have much more romping in the bedroom, they have done well here. However, many of these techniques speak to me of cults, the ones with the leader getting all the brainwashed girls.
A much better sign of rationality is success in career, in money, in fame—to be Successful. Not just to have more fun. Being successful hasn’t been much demonstrated, though I am still hopeful.
The irony is that I recall a few years ago reading someone criticizing LWers to the effect that ‘I would be more impressed by their so-called rationality if they were losing their virginity or getting laid more, than the stuff they focus on’. So, the NYCers are apparently doing just that and the response is this?
(Truly, damned if you do and damned if you don’t.)
It would be nice if they had a list of awesome people who agree with LessWrong or support it, instead of pointing to Thiel and Tallinn over and over.
Also, we don’t want to forget the people this will attract. Being obviously happy works pretty well for Mormons; it might work well for LessWrong. I dunno the value of the tradeoffs there, but it’s probably non-negligible, especially if you’re interested in getting your group to skew younger.
Who’s included in “we”?
Good question, I would also like to know the answer to this, as well as to the following: which parts of the article do “we” believe are inaccurate?
As noted elsewhere, the article is generally accurate and Eliezer conceded “hatchet job” wasn’t really the right word. But “we” includes most of the people featured in the article, many of whom did not know there was a reporter at a party.
Yes, that’s a good point. Recording people without their knowledge, and then publishing the article without their consent, is clearly unethical. That said, though, once the cat is out of the bag, attempting to cram it back in is futile.
I’ve said it before: what the SIAI needs are some real, tangible accomplishments. Then Eliezer could just say, “Yeah, our cuddle puddles are sensational and all, but guess what? You know that cancer vaccine that got FDA approval last week? We wrote the software that built it. How do you like dem apples?”
That was a bit of hyperbole, of course, but hopefully the idea is clear.
I think SI needs to work on real, tangible accomplishments to be able to fulfill its stated mission at all, given its reliance on self-education and the importance of calibration to self-education.
I really don’t see it as much of a hatchet job. It reads to me like “these people are a bit strange, but interesting”, which I have trouble taking offense at. Certainly it picks and chooses the “interesting” stuff, but it doesn’t strike me as particularly worse than normal human-interest stories (judging by the very limited sample of news articles whose subjects I have close personal knowledge of).
I suspect if this was actually a hatchet job (as in, the reporter really was intentionally trying to make LW look bad, or really didn’t like someone), it would be a lot worse.
Calling it a hatchet job seems… disingenuous. Especially given that I don’t see many specific objections being raised. Sure, it could be better, and it’s not something an insider would have written. But neither of those surprises me, based on what I know about journalists and news articles.
A hatchet job implies destruction as the goal. Usually the target would be someone or something the author finds threatening or dangerous. Targets that are perceived as legitimate or powerful get hatchet jobs. This was just pure mockery.
I agree. I don’t think this is great publicity, but I don’t think it’s too actively bad, particularly given the intended audience (this is the paper Sex and the City is based on; I expect they have a relatively pro-poly attitude). Furthermore, I think the negative aspects are due to the unfortunate (from our perspective) fact that the article was about the NY group as a tribe/lifestyle rather than about the singularity or rationality per se, and not the result of the kind of malice that “hatchet job” usually implies.
Many of the people mentioned are not in the New York group currently; they’re in Berkeley. However, New York media stereotypically see the world as revolving around New York.
It seems to me that some of the biggest tension between this article and the way LWers see ourselves is that the article is about people and their human quirks (living arrangements, sexual habits, and physical behavior), with the ideas presented as irrelevant eccentricities. Whereas within the LW-space, the ideas are pretty important. It’s like an article about Nikola Tesla that focuses on his affection for pigeons.
I think you’re right. Maybe if LW’s ideas bore more fruit in the external world, journalists would give them more airtime compared to gossip...
I am inclined to agree with your first request about not rewarding reporting like this with increased page rank. As such I won’t re-add the link.
However, I’m having trouble understanding why a discussion about a portrayal of LW in the media isn’t something worth discussing here.
“In early 2005, Google implemented a new value, “nofollow”, for the rel attribute of HTML link and anchor elements, so that website developers and bloggers can make links that Google will not consider for the purposes of PageRank—they are links that no longer constitute a “vote” in the PageRank system.”
I’m not sure how to use (or if it’s even possible to use) “nofollow” with the markup here, though.
Because the article is moderately negative.
I think most of the parts I’d see as “hatchet job” can be explained by sensationalism and are subverted by other parts of the article. Maybe the biggest subversion is that LW gets treated as a group of “others” in only two places, while much of the article is spent on humanizing people, which is unusual during a hatcheting. The impression I got was less “here’s this weird group of people, let’s apply the absurdity heuristic” and more “here’s this group of people who believe something interesting and unusual, and here is everything I could find out about their sex lives.” Which can still be bad for the people tabloidized (sorry!), but for LessWrong seems fine.
I’d be willing to swap “tabloid” for “hatchet”, sure.
How come I don’t see an “edited” star in the original comment? Is it because it was edited using Mod Powers?
This was an attempted hatcheting that ended up with the author really sincerely liking the group, but she still had to publish a hatchet piece because that’s what her editors were expecting.
Are you just speculating, or do you have a strong reason to believe this?
Speculation, but I’m somewhat confident that this is how the editorial process of a tabloid works.
First, before having read this post, I found a reference in the welcome thread to someone who joined Less Wrong because they read the article and found it interesting. They didn’t link the article in their welcome post, which naturally made me curious and made me want to read it, in an “Article? What article? I don’t see a link to an article.” way. I then found a link to the article online and posted it in my comment here:
http://lesswrong.com/lw/90l/welcome_to_less_wrong_2012/739j
You can delete it if you feel it necessary, although given the heavily upvoted objections to removing it from the original post, you may feel that would not be necessary.
Secondly, is the policy linked somewhere that I just can’t find? If it isn’t, should it be somewhere like http://wiki.lesswrong.com/wiki/FAQ ? The only link I can find about you and hatchet/tabloid reporting is your article on http://lesswrong.com/lw/uu/why_does_power_corrupt/ where you mention such points as:
Meh. I think the main effect of removing the link was that, in order to get to the article, instead of spending about 0.2 seconds to click on the link, one has to spend nearly a second to select the title, right-click it, click “Search Google for ‘Faith …’”, and click on the top result.
If that hadn’t been true I would’ve been much more reluctant to delete the URL. I was trying to control Pagerank flow, not information discovery. Please feel free to mirror the text on your home site and direct readers there.
Did you know you can put a “nofollow” attribute on a link and control search ranking that way?
We don’t have sufficient development resources to do very basic things with the LW codebase. This is one of them.
You can edit the post manually and put it in. In the post editor, click on the “HTML” toolbar icon to go into source code mode and then change the link so it looks like
    <a href="..." rel="nofollow">link text</a>
Huh. I tested this and it appeared to work, which surprised me, because it’d been previously claimed that this HTML editor would filter all attributes not explicitly allowed (e.g. to filter Javascript misbehavior). Perhaps that one is explicitly allowed.
A good sign is that the author of the piece made a correction based on something said in the comments section.
What correction are you referring to?
I don’t see a record of a correction, the way journalists usually work.
Maybe it would be obvious what you talking about if I reread the article with it in mind, but I don’t want to do that.
Someone who works for SingInst choosing to override Eliezer’s decision strikes me as unlikely. Not that Eliezer is likely to be aware that Malo works for them, via Luke. (I am fairly sure this is the same Malo.)
Eliezer: Even without considering the extra influence you have over Malo it may have been more practical for you to request via comment and by personal message that the OP remove the link. Very few people would refuse—many may even be more inclined to take the offer to add it back than to refuse to remove it. You could expect the link to remain for only a few hours longer and the likely reception by the community would likely switch from significantly negative to neutral or slightly positive.
This is a case where the degree of power exercised in the two options would be technically almost equivalent but the perception of the intervention would be entirely different—especially if you put some more thought into how to word your comment. Or, heck, just get Luke to do it or ghost write it for you.
Anissimov (the Media Director) doesn’t, Kevin doesn’t. Vassar doesn’t.
Hmm, weird how people here go all libertarian just because EY does not want to contribute to some half-assed article’s Google rank.