Is that a term Yudkowsky came up with? What is with him and doing horrible things to babies?
It’s the most instant-squick-flinch-inducing thing he can imagine. EY wrote:
To reduce the number of hedons associated with something that should not have hedons associated with its discussion, I will refer to the subject of this discussion as the Babyfucker.
Even so, the godlike AGI is still recognized as a real world object
While religious people think of their gods as fictional objects?
through which conveniences, resources and luxury flow, not an intrinsic, personal part of experience. I say transcendent in the spiritual context.
Singularitarians (at least the Kurzweil-Chalmers-Yudkowsky variant) believe that when the time comes, people will upload their minds to computers, where they will enjoy enormously increased mental abilities and sensory experiences, and possibly even merge into some kind of collective mind. I’d say this is as ‘spiritual’ as it gets.
In that case, the best way I can differentiate between singularitarian transcendence and spiritual transcendence is that the former is based on a future expectation. A spiritual person can believe that they are experiencing transcendence at the present moment, or at least believe that the greater powers that be can be utilized in their present lives through prayer, contemplation, ritual or meditation. A singularitarian can hold no such belief, and is essentially biding their time until the transcendent AI is actually created. How many singularitarians have the mental stamina to hold the belief that the greatest experience of their lives is somewhere far away from their immediate situation? I’d go so far as to say that a belief like that, if held too tightly, will cause a person perpetual dissatisfaction and boredom.
In short, I’d say that some of the difference between the mental health of spiritualists and singularitarians can be attributed to the former getting more immediate results.