The majority of posters here are in the prime demographic for suicide
Did you mean that the posters are drawn from a demographic which has suicidal tendencies (young adults), or that LessWrong is a demographic which has a higher proportion of people with suicidal tendencies?
Young males, often single: that is the demographic (though I believe that IQ is inversely correlated). Religion is a protective factor, and though singularitarianism is not a recognized religion (though SIAI is tax exempt), its adherents hold beliefs that should have the same effect as those held by more orthodox believers.
Not necessarily. One of the big protective aspects of religion is its community. Singularitarians, by virtue of their small numbers, have less of that.
That may be part of it, and I'm not sure if it was controlled for, but the study I read focused specifically on beliefs: for instance, do you believe suicide is morally wrong? Do you believe in hell? Of LessWrongers they could ask: do you believe in resurrection through cryonics? Or another possible question: does babyfucking await anyone who commits suicide rather than maximizing the chances for FAI?
In many cases, religions provide a being/entity/cosmic absolute/”intrinsic property” which is:
Outside of conventional human understanding
A source of emotional significance, labelled “transcendent”
Emphasized as an emotional experience over an intellectual understanding, due to being outside of conventional human understanding anyway.
Do singularitarians have such a “transcendent constant”? Is there an “intrinsic property” which is compatible with hard materialism? What would a “cosmic absolute” be? What would “babyfucking” be?
Yes, that transcendent focus is the weakly, and eventually strongly, godlike AGI! Babyfucking is what awaits those who know it needs help to come to fruition and instead do less than their best to make that happen. Committing suicide would be a great shortfall indeed. More minor sins, resource misallocations, may be forgiven if they are for the greater good. For example, I could donate $10 to SIAI or I could see a movie. The latter will lead to eternal damnation, I mean babyfucking, unless I believe that the purchase will enhance my ability to contribute to the AGI’s construction down the road.
Is that a term Yudkowsky came up with? What is with him and doing horrible things to babies?
Even so, the godlike AGI is still recognized as a real world object, through which conveniences, resources and luxury flow, not an intrinsic, personal part of experience. I say transcendent in the spiritual context.
It’s the most instant-squick-flinch-inducing thing he can imagine. EY wrote:
To reduce the number of hedons associated with something that should not have hedons associated with its discussion, I will refer to the subject of this discussion as the Babyfucker.
Even so, the godlike AGI is still recognized as a real world object
While religious people think of their gods as fictional objects?
through which conveniences, resources and luxury flow, not an intrinsic, personal part of experience. I say transcendent in the spiritual context.
Singularitarians (at least the Kurzweil-Chalmers-Yudkowsky variant) believe that when the time comes, people will upload their minds to computers, where they will enjoy enormously increased mental abilities and sensory experiences, and possibly even merge into some kind of collective mind.
I’d say this is as ‘spiritual’ as it gets.
In that case, the best way I can differentiate between singularitarian transcendence and spiritual transcendence is that the former is based on a future expectation. A spiritual person can believe that they are experiencing transcendence at the present moment, or at least believe that the greater powers that be can be utilized in their present lives, through prayer, contemplation, ritual, or meditation. A singularitarian can hold no such belief, and is essentially biding their time until the transcendent AI is actually created. How many singularitarians have the mental stamina to hold the belief that the greatest experience of their lives is somewhere far away from their immediate situation? I’d go so far as to say that a belief like that, if held too tightly, will cause a person perpetual dissatisfaction and boredom.
In short, I’d say that some of the difference between the mental health of spiritualists and singularitarians can be attributed to the former getting more immediate results.