I was surprised to see the initial spike of downvotes. Stabilizer suggested a model for a not-at-all-uncommon thinking pattern and asked for more information. Whether the model is good is debatable, but the post itself is certainly not below the average quality for Discussion.
Earlier today, I noticed this post published in Main with a score of −7. Later I noticed it was moved to Discussion, and the score slowly increased from there.
Oh, I didn’t notice that it was in Main originally. That would certainly explain the downvotes.
Huh, I didn’t know that I had posted it on Main accidentally. Weird. No intention on my part, though.
There may be other reasons to downvote, for example thinking that publishing this article is likely to create a ton of negative utility. We live in a world, among humans. There are consequences. It then depends on one’s model of how people will react to articles and discussions like this, and also on a model of how the discussion would develop if this article gets positive karma.
Are you saying that some readers might be afraid that someone proposing that suicide can be a status-raising move may encourage someone to go through with it? This seems a bit far-fetched.
No, it’s not far-fetched. Newspaper reports of the suicide of celebrities increase suicide rates. http://www.samaritans.org/media-centre/media-guidelines-reporting-suicide is a fairly straightforward media guide on how to handle talking about suicide in print and in public forums.
As of now, the article ranks top when googling “high status suicide”.
As I mentioned before, the popularity of this forum does not approach that of the news media. Not even close. I’m guessing that even similar posts on Reddit rank much higher. And given how balanced the discussion here is, the potential for negative utility is minuscule.
As for your specific search query, it’s fragile: none of the synonyms or similar queries I could think of (not posting them here to avoid accidental indexing) put the article anywhere close to the first page of the search results.
If you think it’s good for the article to have a low readership, then it makes sense to vote it down.
If someone is already considering suicide, helping them contemplate the topic from many angles will increase the probability of their actually doing it. I don’t have a good estimate of how much the probability would increase, but my guess is that the expected damage far exceeds the expected benefit of having this specific discussion.
For me this topic is something of a taboo. In theory, there is nothing wrong with psychologically stable people discussing suicide. The problem is that depressed people usually don’t see themselves as unfit to participate in such a discussion; they are probably even more likely to start or join it. I don’t want to play a part in providing the last straw for someone.
The article itself is not the whole risk; the comments (assuming the article starts a large discussion) would be a greater risk. The more different perspectives, the higher the chance that one of them will impress a fragile mind.
If anything, being able to think through their motivations for doing so makes it more likely that they’ll realize it’s a really dumb move.
I see your point, though it seems overly cautious to me. Reasons for suicide are discussed online and in print all the time, by scientists, fiction writers, poets and just random folks. I doubt that a single post on a single forum is likely to make a difference either way, not nearly enough to make it a taboo.
If someone is already considering suicide, helping them contemplate the topic from many angles will increase the probability of their actually doing it.
Or maybe it will increase the probability that they realize it was a bad idea. How do you know?
And anyway, for some people in certain circumstances, committing suicide may be a rational action.
The article itself is not the whole risk; the comments (assuming the article starts a large discussion) would be a greater risk. The more different perspectives, the higher the chance that one of them will impress a fragile mind.
So that’s another basilisk? Hmm, it seems to me that we can put up another basilisk to counter it: Don’t kill yourself or … :D
And anyway, for some people in certain circumstances, committing suicide may be a rational action.
That was exactly my point: I meant that sooner or later someone would write something like this.
Now imagine a depressed person reading that and thinking: “even the smart people on LessWrong agree with me” (because a depressed person who accepts that someone could be in a situation where suicide is a rational action will assume it applies to them first of all).
OK, I give up. It seems that explaining why I believe discussing something is wrong only has the opposite effect.
The majority of posters here are in the prime demographic for suicide, and are indeed susceptible to arguments in favor of far-fetched premises without evidence, e.g. the revival of cryonicists by a machine intelligence. However, their strong belief in this prospect will insulate them against suicide attempts, just as devout Christians are protected by their belief that hell awaits suicides and that heaven is possible for those meeting a natural end.
The majority of posters here are in the prime demographic for suicide
Did you mean that the posters are drawn from a demographic which has suicidal tendencies (young adults), or that LessWrong is a demographic which has a higher proportion of people with suicidal tendencies?
Young males, often single: that is the demographic (though I believe that IQ is inversely correlated). Religion is a protective factor, and though singularitarianism is not a recognized religion (though SIAI is tax-exempt), its adherents hold beliefs that should have the same effect as those held by more orthodox believers.
Not necessarily. One of the big protective aspects of religion is its community. Singularitarians, by virtue of their small numbers, have less of that.
That may be part of it, and I’m not sure whether it was controlled for, but the study I read focused specifically on the beliefs, for instance: do you believe suicide is morally wrong? do you believe in hell? Of LessWrongers they could ask: do you believe in resurrection through cryonics? Or another possible question: does babyfucking await anyone who commits suicide rather than maximizing the chances for FAI?
In many cases, religions provide a being/entity/cosmic absolute/“intrinsic property” which is
- outside of conventional human understanding, and
- a source of emotional significance, labelled “transcendent”;
and they emphasize emotional experience of the transcendent over intellectual understanding, since it is outside of conventional human understanding anyway.
Do singularitarians have such a “transcendent constant”? Is there an “intrinsic property” which is compatible with hard materialism? What would a “cosmic absolute” be? What would “babyfucking” be?
Yes, that transcendent focus is the weakly, and eventually strongly, godlike AGI! Babyfucking is what awaits those who know it needs help to come to fruition and yet do less than their best to make that happen. Suicide would be a great shortfall indeed. More minor sins, resource misallocations, may be forgiven if they are for the greater good. For example, I could donate $10 to SIAI or I could see a movie. The latter will lead to eternal damnation, I mean babyfucking, unless I believe that the purchase will enhance my ability to contribute to the AGI’s construction down the road.
Is that a term Yudkowsky came up with? What is with him and doing horrible things to babies?
Even so, the godlike AGI is still recognized as a real-world object, through which conveniences, resources and luxury flow, not an intrinsic, personal part of experience. I mean “transcendent” in the spiritual context.
It’s the most instant-squick-flinch-inducing thing he can imagine.
EY wrote:
To reduce the number of hedons associated with something that should not have hedons associated with its discussion, I will refer to the subject of this discussion as the Babyfucker.
Even so, the godlike AGI is still recognized as a real-world object
While religious people think of their gods as fictional objects?
through which conveniences, resources and luxury flow, not an intrinsic, personal part of experience. I mean “transcendent” in the spiritual context.
Singularitarians (at least the Kurzweil-Chalmers-Yudkowsky variant) believe that when the time comes, people will upload their minds to computers, where they will enjoy enormously increased mental abilities and sensory experiences, and possibly even merge into some kind of collective mind.
I’d say this is as ‘spiritual’ as it gets.
In that case, the best way I can differentiate between singularitarian transcendence and spiritual transcendence is that the former is based on a future expectation. A spiritual person can believe that they are experiencing transcendence at the present moment, or at least that the greater powers that be can be drawn upon in their present lives, through prayer, contemplation, ritual or meditation. A singularitarian can hold no such belief, and is essentially biding their time until the transcendent AI is actually created. How many singularitarians have the mental stamina to hold the belief that the greatest experience of their lives lies somewhere far away from their immediate situation? I’d go so far as to say that such a belief, if held too tightly, will cause a person perpetual dissatisfaction and boredom.
In short, I’d say that some of the difference between the mental health of spiritualists and singularitarians can be attributed to the former getting more immediate results.
I find it plausible. All else equal, most people prefer doing things that raise their status, and someone suggesting that suicide is high-status is itself evidence that suicide raises one’s status. (I do agree with you that an LW post’s quantitative effect is almost certainly small, though.)
I think pointing out that something is high-status typically makes it less high-status, though… it reduces the mystique somehow.
How small is small? If the chance of the post killing someone were p=0.001, would that be small?
Fair question. I’d expect an LW post & discussion of similar size to this one to cause 10^-3 to 10^-4 suicides, but I might now be anchoring on your reply. (Also, that expected value only counts one side of the ledger; I’m ignoring the possibility of the discussion discouraging people from committing suicide.)
To me, it feels big, in the sense of disproportionate, but small in absolute terms.
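(A minimal sketch of the expected-value arithmetic behind this exchange; the readership N and the per-reader probability shifts δ₊ and δ₋ are hypothetical figures chosen only to illustrate the 10^-3 estimate, not numbers anyone in the thread reported:

\[
\mathbb{E}[\text{additional suicides}] = N \cdot \delta_{+}, \qquad \text{e.g. } N = 10^{3},\ \delta_{+} = 10^{-6} \ \Rightarrow\ \mathbb{E} = 10^{-3}
\]
\[
\mathbb{E}[\text{net change}] = N \cdot (\delta_{+} - \delta_{-})
\]

Here \(\delta_{+}\) is the average per-reader increase in suicide probability attributable to the discussion and \(\delta_{-}\) the average per-reader decrease from being talked out of it; the one-sided 10^-3 figure above corresponds to setting \(\delta_{-} = 0\).)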
Some LessWrongers don’t like seeing posts that are obviously wrong rather than interestingly wrong. Unfortunately, a subset of that set also doesn’t want to go to the trouble of explaining why.