When you say you believe this, do you mean you believe it to be the case, or you believe it to be a realistic possibility?
I stumbled across Tipler when reading up on the simulation argument, and it inspired further “am I being a crackpot” self-doubt, but I don’t think this argument looks much like his. Also, I am not really trying to promote it so much as to feel it out. I have not yet found any reason to think I am wrong about it being a possibility, though I myself do not “feel” it to be likely. That said, with stuff like this, I have no sense that intuitions would tell me anything useful either.
“Despite that, the general idea of mind uploading into virtual afterlife appears to be pretty mainstream now in transhumanist thought (ie Turing Church).”
Yeah, it comes up in “Superintelligence” and some other things I have read too. The small difference, if there is one, is that this looks backwards, and could be a way to collect those who have already died, and also could be a way to hedge bets for those of us who may not live long enough for transhumanism. It also circumvents the teletransportation paradox and other issues in the philosophy of identity. Also, even when not being treated as a goal, it seems to have evidential value. Finally, there are some acausal trade considerations, and considerations with “watering down” simulations through AI “thought crimes,” that can be considered once this is brought in. I will probably post more of my tentative thoughts on that later.
“I think it’s fun stuff to discuss, but it has a certain stigma and is politically unpopular to some extent with the x-risk folks. I suspect this may have to do with Tipler’s heavily Christian religious spin on the whole thing. Many futurists were atheists first and don’t much like the suspicious overlap with Christian memes (resurrection, supernatural creators, ‘saving’ souls, etc.)”
The idea of posting about something that is unpopular on such an open-minded site is one of the things that makes me scared to post online. Transhumanism, AI risk (“like the Terminator?”), one-boxing Newcomb’s Paradox: LW seems pretty good at getting past the initial discomfort with these to dig deeper. I had actually once heard a really short piece about “The Singularity” on the radio, which could have been a much earlier introduction to all this, but I sort of blew it off. Stuff like my past flippancy makes me inclined to distrust my gut and superficial reasons to ignore something, and to try to take a really careful approach to deconstructing arguments. I am also an atheist, and grew up very religiously Christian, so I think I also have a strong suspicion of and aversion to its approach. But again, I try not to let superficial or familial similarity to things interrupt a systematic approach to reality. I am currently trying to transition from doing on-the-ground NGO work in developing countries in order to work on this stuff. My gut hates this, and my availability bias is doing backflips, but I think that this stuff might be too important to take the easy way out of it.
Also, your point about the hook is absolutely correct. I was sort of trying to imitate the “catchy” Salon/HuffPost/BuzzFeed headline that would try to draw people in: “Ten Ways Atheists Go to Heaven, You Won’t Believe #6!” It was also meant a bit self-deprecatingly.
“There are also local considerations which may dominate for most people—resurrection depends on future generosity which is highly unlikely to be uniform and instead will follow complex economics. “Be a good, interesting, and future important person” may trump x-risk for many people that can’t contribute to x-risk much directly.”
Yeah, there is a lot here. What is so weird about the second disjunct is that it means that we sort of do this or fail at this as a group. And it means that, while lying on my deathbed, my evaluation of how well we are doing as a species is going to bear directly on my credence about what, if anything, comes next. It’s strange, isn’t it? That said, it is also interesting that, even if we somehow knew that existential risk would not be a problem in our lifetime, with this there is a purely selfish reason to donate to FHI/MIRI. In fact, at the right scale, with high enough odds and enough marginal benefit per donation, it could be the economically rational thing to do.
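Here is a toy expected-value sketch of that last point; every number in it (the marginal bump in resurrection odds per dollar, the value placed on being resurrected, the donation size) is a made-up placeholder, just to show the shape of the argument:

```python
# Toy expected-value sketch of the "selfish donation" point above.
# Every number here is an illustrative placeholder, not an estimate.

p_bump_per_dollar = 1e-9     # assumed marginal increase in resurrection odds per dollar donated
value_of_resurrection = 1e7  # assumed dollar-equivalent value placed on being resurrected
donation = 10_000            # dollars donated toward x-risk reduction

expected_selfish_gain = donation * p_bump_per_dollar * value_of_resurrection
print(f"Expected selfish gain: ${expected_selfish_gain:,.2f} on a ${donation:,} donation")
# With these made-up numbers the gain is only $100, i.e. a losing bet;
# shift p_bump_per_dollar or value_of_resurrection up a few orders of
# magnitude and the same donation becomes selfishly rational.
```

The point is not the particular numbers; it is that the conclusion flips entirely on a couple of very uncertain parameters.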
“When you say you believe this, do you mean you believe it to be the case, or you believe it to be a realistic possibility?”
Well naturally I believe the latter, but I also believe the former in the sense of being more likely true than not.
“I stumbled across Tipler when reading up on the simulation argument, and it inspired further ‘am I being a crackpot’ self-doubt, but I don’t think this argument looks much like his.”
Tipler isn’t a full crackpot. His earlier book with Barrow, “The Anthropic Cosmological Principle,” was important in a number of respects and influenced later thinkers such as Kurzweil and Bostrom.
Tipler committed himself to a particular physical cosmology, which is now out of date in light of newer observations. Cosmological artificial selection (evolution of physics over deep time via the creation of new ‘baby’ universes by superintelligences) is far more likely. In any kind of multiverse, universes which reproduce will dominate in terms of observer measure.
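A minimal back-of-the-envelope sketch of that last claim; the reproduction rate, number of generations, and the assumption of similar observer counts per universe are all arbitrary, chosen only to show the geometric effect:

```python
# Minimal sketch: if some universes spawn offspring universes each generation
# and others do not, the reproducing lineage quickly dominates the count of
# universes (and hence of observers, assuming similar observers per universe).
# Both parameters are arbitrary, chosen only to illustrate the geometric growth.

offspring_per_universe = 3   # assumed reproduction rate of a 'fertile' universe
generations = 10             # assumed number of reproduction cycles

sterile = 1                  # one non-reproducing universe, forever
fertile = 1                  # one reproducing universe to start with

for _ in range(generations):
    fertile *= offspring_per_universe

fraction = fertile / (fertile + sterile)
print(f"Fraction of universes in the reproducing lineage: {fraction:.6f}")
# ~0.999983 after 10 generations: almost all observers would find themselves
# in universes descended from reproducing ones.
```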
“considerations with ‘watering down’ simulations through AI ‘thought crimes,’ that can be considered once this is brought in.”
Not sure what you mean by this.
“The idea of posting about something that is unpopular on such an open-minded site is one of the things that makes me scared to post online.”
Don’t let that stop you. You can post it on your blog and then discuss it here and elsewhere. LW discussion is more open-minded these days.
“I am also an atheist, and grew up very religiously Christian, so I think I also have a strong suspicion of and aversion to its approach.”
I was an atheist until I heard the sim argument and I then updated immediately.
It is interesting to look at the various world religions in light of Simulism and the Singularity. Some of the beliefs end up being inadvertently correct or even prescient.
For example, consider beliefs concerning burial vs. cremation. It’s roughly a 50/50 split across cultures/religions over time. Both are effective from a health/sanitation point of view, but burial is somewhat more expensive. Judeo-Christian religions all strongly favor burial (cremation was actually outlawed in medieval Europe). Hinduism, on the other hand, strongly supports cremation.
In the standard (pre-Singularity) atheist worldview, these are just arbitrary rituals.
However, we now know that this couldn’t be further from the truth. Burial preserves DNA for thousands, if not tens of thousands, of years. So at some point in the near future, robots could extract all of that DNA and use it to help in resurrection simulations. Obviously, having someone’s DNA is just the beginning of the information you need for mind reconstruction, but it’s a very important first step.
There are a number of other features/beliefs like this that Western religions (and certain strains of Christianity in particular) probably got right: the general idea of a future hard eschatology (the end of human history, i.e. rapture/singularity), resurrection of the dead, afterlife reward and judgement by a future superintelligence, divinization/deification (humans becoming gods) …