I’m not so sure that this post is something I need to see. I was pointing out parallels in Eliezer’s language to something you would hear from an evangelist.
If there is a specific point you’d like to discuss I’d be happy to do that.
You started this thread with a vague claim. If you want to talk about specifics, you should quote something that Eliezer has said and explain what Christian overtones you think it has. Pointing to the word “saved” without any context is not enough.
I thought people would have seen the videos, and thus understood what I was talking about in context. Oh well, here are the quotes:
http://www.youtube.com/watch?v=vecaDF7pnoQ#t=2m26s
That’s how the world gets saved.
http://www.youtube.com/watch?v=arsI1JcRjfs#t=2m30
The thing that will kill them when they don’t sign up for cryonics.
http://www.youtube.com/watch?v=lbzV5Oxkx1E#t=4m00s
But for now it can help the rest of us save the world.
(Probably some paraphrasing but the quotes are in the videos).
Other quotes were in the Vimeo video, but these mainly concern the argument that the singularity is obviously the number one priority. Also troubling to me is the idea that the world is irredeemably flawed before the emergence of FAI. Christianity very much rests on the notion that the world is hopeless without redemption from God.
So the similarity mainly lies in the notion that we need a savior, and look, we have one! The “you will die without cryonics” claim is sort of icing on the cake.
To all this I would mainly argue what Jaron Lanier does here:
http://bloggingheads.tv/diavlogs/15555?in=00:46:48&out=00:51:08
Meanwhile, Eliezer asserts that he will cure AIDS.
There is a lot to like about this world and a lot of problems to work on. However, it is ridiculous to assert you know the number one priority for earth when you have no evidence that your project will be nearly as successful as you think it will be.
You have only weak surface similarities, which break down if you look deeper.
In the Christian concept, people need to be saved from the moral punishment for a sin committed before they were born, and this salvation is available only by accepting the religion, and it is absolutely morally right that those who do not accept the religion are not saved, on the authority of a supremely powerful being. The salvation consists of infinite boredom rather than infinite pain after you die.
On the other hand, the concept of an FAI saving the world involves saving people from the harsh reality of an impersonal universe that does not care about us, or anything else. The salvation is for anyone it is in the FAI’s power to save; the requirement of cryonics exists only because even a superintelligence would likely not have enough information about a person to give them new life after their brain had decayed. If it turns out that the FAI can in fact simulate physics backwards well enough to retrieve such people, that would be a good thing. People who happen to be alive when the FAI goes FOOM will not be excluded because they aren’t signed up for cryonics. The salvation consists of as much fun as we can get out of the universe, instead of non-existence after a short life.
Lanier’s argument, within the segment you linked to, seemed to consist mostly of misusing the word ideology. Throughout the diavlog, he kept accusing AI researchers and Singularitarians of having a religion, but he never actually backed that up or even explained what he meant. Meanwhile, he seemed to be worshiping mystery, particularly with regard to consciousness, and was evasive when Eliezer questioned him on it.
Consider me incredibly underwhelmed to hear a recitation of Eliezer’s views.
It is humorous that you simply assert that Lanier misuses the word ideology. What I find compelling is his advice to simply do the work and see what can be done.
Eliezer is a storyteller. You like his stories and apparently find them worth retelling. Far out. I expect that is what you will always get from him. Look for results elsewhere.