The Singularity as Religion (yes/no links)
My own opinion is that it’s not worth much to argue over the boundaries of a vague term like ‘religion,’ and of course the question should not be ‘Does the Singularity hypothesis share some features with religious hypotheses?’ but instead ‘Is the Singularity hypothesis plausible, and what are its likely consequences?’
There is a subset of pro-Singularity individuals that is acting in a very religious fashion. See the prior discussion here, where ata pointed to the Singularity 2045 Facebook group, which includes the text:
This isn’t just a small group of random people either. Michael Anissimov and Aubrey de Grey are both administrators.
I would hope this is simply a case of both of these individuals joining any singularity-related FB group for PR, and the original admin seeing this and granting them admin privileges.
Yup.
Does anyone know where the 2045 figure came from?
Is there anything more to it than “hmm, we need a date that’s distant enough not to strain plausibility but close enough that most people expect to still be alive”?
The exact number is Kurzweil’s predicted date from his book “The Singularity is Near.”
Well, I suppose that answers the question of why the Facebook group uses it.
Any idea why Kurzweil chose it? Was there any kind of quantitative thinking involved?
I’m not accusing anyone; it’s a question that currently leaves me puzzled, so I’d be interested in seeing whether he has any kind of justification.
Projecting his double-exponential growth of computer hardware, he gets total computation by computers exceeding the computation in all human brains (using his estimate) by a factor of a billion around then, I think.
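Here is a toy sketch of that kind of projection, not Kurzweil’s actual model; every number in it is an illustrative assumption (brain ops/sec, brain count, starting compute, and growth rate are all made up to land in the right ballpark):

```python
import math

# Toy sketch of the projection described above, NOT Kurzweil's actual
# model; every number here is an illustrative assumption.
BRAIN_OPS = 1e16                     # assumed ops/sec per human brain
N_BRAINS = 1e10                      # assumed number of human brains
TARGET = 1e9 * BRAIN_OPS * N_BRAINS  # a billionfold all human computation

def log10_machine_ops(year, base_year=2000, base_log10=12.0, r=1.024):
    """Double exponential: the exponent itself grows exponentially."""
    return base_log10 * r ** (year - base_year)

year = 2000
while log10_machine_ops(year) < math.log10(TARGET):
    year += 1
print(year)  # lands in the mid-2040s under these made-up parameters
```

The point of the sketch is just that under double-exponential growth, even a billionfold target gets crossed within a few decades of the base year.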
Thanks, I’ll have to look into it further.
It is one of the more common dates for the Singularity. Timtyler made a graph of Singularity predictions a while ago, and I think the mean was around 2040. I suspect that you’ve hit on part of what is going on, along with general wishful thinking. This SMBC seems relevant.
I was similarly somewhat alarmed by that when I found it, but I think for the most part it’s just one very (um...) enthusiastic person. (curiousepic (below) is almost certainly correct about why some non-crazy persons are administrators of the Facebook group.) I’d bet that nobody actually does the monthly 20:45 SINGULARITY MESSAGES thing.
This particular failure doesn’t include the further hundreds of people who joined.
Admin on Facebook isn’t opt-in, it’s more like they clicked “yes” when invited to join the group and then the group creator set them as admins in order to increase the status of his group.
Myself, I am less disturbed by people taking the Singularity as if it were the Rapture than I am by people taking the Singularity as if it were just the excuse they needed to feel happy about slacking off and underachieving.
“The Singularity is my retirement plan”, that sort of thing. Ew.
“The Singularity is my retirement plan” is actually a rational attitude to have given a slight possibility of a successful singularity. This should be a motivating factor, not an excuse for slacking. “You can’t take it with you” and “you’re going to die anyway” stop being legitimate excuses. It prompts desperate action.
There are different questions here:
Plausibility of the idea itself, in various specific senses.
Sanity of specific groups associated with the idea.
In the case of the Singularity, neither of these questions is trivial. Quite a few Singularity believers are cult material, and there are senses of “Singularity” that are clearly wrong.
Well, it might conceivably be worth asking the question “Does the Singularity hypothesis share enough features with religious hypotheses that organizations dedicated to thinking about it run a significant risk of demonstrating other attributes of religious/theological organizations?”
Along with the related “If so, would that be a bad thing, and what could we do to mitigate that risk?”
That said, my own answers are “Not especially, although some of the same sorts of people who would otherwise be attracted to religious concept-clusters will become attracted to Singularitarian concept-clusters for this reason, as the latter become more popular.” and “It’s not ideal, but it’s tolerable. As long as we’re careful to distinguish between reasoning and confabulating, we should be OK.”
And if I’m right, then this isn’t a particularly important question to devote energy to.
You’re forgetting the most important aspect of the issue. If there is a problem with technology-related existential risk, then it’s important to get high-status people to understand it and take it seriously. However, if the issue is automatically associated in the public mind with low-status people and presumed crackpots, this will become far more difficult. It doesn’t matter how good a case you have that the problem is serious, if its very mention will trigger people’s crackpot heuristics and make them want to distance themselves from you for fear of low-status contamination.
I suppose.
Though it seems like the easiest way to engage with that aspect is from the other direction: figure out what the high-status “paint” is and start engaging in discussions of the issue using that paint.
Though if “the Singularity” is already tarred with low status, then presumably this isn’t the right location to do that.
Wally Weaver: You see, at the time I was misquoted. I never said ‘The Super-man exists and he is American’, what I said was ‘God exists and he is American’. Now if you begin to feel an intense and crushing feeling of religious terror at the concept, don’t be alarmed. That indicates only that you are still sane. (Watchmen)
There is something to it.
There’s a rather uncommon theological position, espoused by Paolo Soleri (and perhaps by others), that God, the rapture, etc. are better regarded as a potential future, as something we have a responsibility to create, than as something pre-existing; in this view, religious texts can be viewed as imperfect but still visionary accounts of what such a thing might look like. The Singularity hypothesis seems to fit better in this model of religion than in more mainstream models. Soleri’s theology seems far less pathological than religions tend to be, since it calls for both concrete action and accurate models of reality, so maybe this isn’t such a bad thing.
The line I came up with, when asking the question to myself, was this: If the singularity is a religion, it is the only religion with a plausible mechanism of action.
Religion isn’t just a set of “hypotheses”, though; it’s also a set of human social behaviors. Religions entail various sorts of group and individual practices — such as worship ritual, fellowship, prayer or meditation, study of received texts, adherence to charismatic leaders, moral correction of “straying” members, instruction of children, evangelism of adults, rites of passage (including baptisms, weddings, and funerals), financial support of institutions and leaders, and so forth.
Not every “religion” has all of these, and some institutions that exhibit these behaviors we would say are not “religions”. (So “religion” may be kind of like Wittgenstein’s “game”.) But often when people refer to a movement or group as being “like a religion” (or “cultish”, for that matter), they’re referring to practices like these rather than to beliefs. The more religion-like practices a group has, the more likely people are to think of it as one.
Ok, I was hoping that these would be actual breakdowns of the beliefs of transhumanists and how those point to the fact that they are a religion. Unfortunately, it was a lot less helpful than that.
Transhumanism, at least as expressed on this site (which hereafter I will just refer to as transhumanism), is as far as I can tell a religion. It is in fact a standard Abrahamic religion in form. Here is my reasoning for saying that:
1.) Belief in the elect. Transhumanists feel that they alone are rational enough to see the truth, to the point that some of them believe it is better that others are not told the truth.
2.) Belief in God. Transhumanists believe that they will make a Friendly AI that will take the place of God.
3.) Belief in Resurrection. Transhumanists believe in Cryogenics such that when they die they will be frozen and eventually resurrected.
4.) Belief in Immortality. Transhumanists believe in life extension to the point that eventually their bodies will be able to live indefinitely long.
5.) Belief in Eternal Life. Transhumanists believe that eventually uploads will be possible such that even in the case of an unfortunate accident they will be able to survive.
6.) Belief in Theosis/Exaltation. Transhumanists believe that it may become possible to become one with their AI/God.
7.) Belief that everyone that disagrees with them is a heretic. Aumann’s Agreement theorem has serious problems in it, yet it is used in arguments as though it were true. If someone doesn’t hold the “correct” beliefs about AI or other topics it is automatically assumed that they are less capable or less rational than those that do. This in spite of the fact that they may have additional information that is influencing their priors in important ways.
8.) Sacred Symbols. At least those that are signed up for cryonics have symbols that they wear and use to spark interest in their beliefs or debates about their beliefs.
9.) Belief in the coming of a messiah—the singularity.
10.) Holy Script—the Sequences.
Ok, now that I have you fired up, please deconstruct this. Most especially try deconstructing this argument as though you were explaining things to someone that didn’t understand the science or philosophic theories that these belief structures are built off of. Realize that many of the people on this site do not have the training to understand these theories as far as I can tell.
“Is X a religion?” is not a very substantive query, particularly if category membership is not clear. It’s an argument about definitions, a rube/blegg question. An interesting question would be, say, whether lesswrong/transhumanists/rationalists are on an epistemic death spiral and will end up (or already are) comfortably compartmentalized (or not) deluded lunatics, like so many religious people. But this statement would need arguments focused on it in particular, and not just general reference-class tennis that assigns connotationally related categories.
See 37 Ways That Words Can Be Wrong and Diseased thinking: dissolving questions about disease.
So, a great deal has already been said about your reasons for concluding that “transhumanism is a religion,” and I don’t think I have anything especially useful to add to that discussion.
I’m curious, though, as to why it matters.
That is… OK, suppose I accept your reasoning in its entirety. (I don’t, but never mind that for now.)
That is, suppose I accept that soi-disant transhumanists who endorse certain texts (#10) and believe in an upcoming singularity (#9) and wear identifying symbols (#8) and dismiss those who disagree with them (#7) and believe it may become possible for human minds to join with a superhuman AI (#6/#2) and believe it’s theoretically possible to extend human life and/or consciousness indefinitely (#4/#5) and to reconstruct human consciousness from brain cells (#3) and that they are better able to see the truth of these things than most people (#1)… suppose I accept that they are, by virtue of those things and those things alone, a religion.
To put this another way: suppose I accept extending the set “religion” to include the intersection of those things.
What interesting consequences follow from my accepting that? Why should it matter?
First, it should place such statements about other religions on the same level as Gnostics calling everyone else unenlightened for worshiping the Demiurge.
Second, it should call into question the statements that religion is irrational and that those who follow this are more rational than anyone else.
Third, everyone should be aware of this in order to realize that these beliefs are biasing their assessment of other people’s rationality and capabilities.
Fourth, Transhumanism should be treated by society in the same way as any other religion; that is, it should not be taught in the public school system and should be recognized as religious in nature when discussed in college classrooms.
That is why it matters in my opinion.
I’m not at all sure what you mean by your first point.
Regarding your second point, it seems that you are arguing by definition. If religion is so broad as to include some very vague cluster of transhumanist beliefs, then it also includes organized sports and a variety of other things. At that point, people will probably agree that religion isn’t necessarily irrational for the simple reason that religion includes so much.
Regarding your third point, that seems for related reasons to just not be helpful. Even if something rational is a “religion”, that would not stop some religions from being more rational than others. For instance, I suspect that you agree that ultra-Orthodox Jews who insist for religious reasons that mice can spontaneously generate are being more irrational than the members of many other religions.
Regarding point four, as far as I’m aware transhumanism isn’t taught in public schools or treated in colleges much at all. So I don’t see why you care about this. Indeed, if some amalgam of Singularitarianism and transhumanism gets recognized as a religion, the only practical consequence as far as I can tell is that Eliezer Yudkowsky would be eligible for the parsonage allowance. So are you saying that the IRS should make things easier for the Singularity Institute?
Yes, absolutely.
I can think of two posts where teaching transhumanism in public schools is explicitly advocated. Hence, it is already being thought of.
Yes, they are. On a tangent to that note, would it be kosher to grow non-kosher insects in apples, since they would be thought to spontaneously generate and would therefore be kosher?
Anyways, the point is that the assumption is made that religion is by default irrational when it might actually be rational. Some of the greatest philosophers have been highly religious. Also, mystical experiences are accepted as fact here as long as they are not religious mystical experiences (see the benefits-of-madness discussion), which most religions will tell you is very dangerous, as there are beings that want to deceive. So I don’t see the claim that the arguments of this site are especially more rational than those of the most rational religious people as being valid. There are also instances of people claiming they are more rational than Nobel laureates, which doesn’t help the claim that they are rational at all.
The bounds of religion already contain things that have less of a structure of beliefs than Transhumanism (both as expressed on this site and as described in the Wikipedia article).
Christianity says “murder is bad.” Does that mean saying to kids in public schools “hey, murder is bad” becomes off limits? A similar remark would apply to religions which have correct beliefs about the shape of the world. Note that even if an idea or set of ideas is subscribed to by a religion, that’s not an intrinsic reason to exclude it from the classroom. That’s part of why First Amendment issues in the US are so complicated and difficult.
No. They believe that some animals (lice and rodents in particular) can spontaneously generate, not that they in general necessarily do. However, there’s actually a serious issue connected to this in that figs often have wasps inside them, but this wasn’t known in ancient times when figs were first ruled to be kosher (and in fact, more than that, figs are explicitly listed as one of the special foods of Israel in the Bible). No one in the Orthodox community has a good answer for this.
Well, again you are using a very broad notion of religion. The individuals asserting that religion is in general irrational are probably less likely to do so if one is using such a broad definition.
And if one uses a narrow definition of religion, then for most major religions, if any one of them is correct, then the others need to be not just irrational but deeply so. So even in that framework, thinking that most religions are irrational is the rational thing to do.
So? That’s not a good argument for much of anything at all. Newton also believed in alchemy. Erdos had trouble with the Monty Hall problem. Or for that matter, some of the greatest philosophers have been geocentrists. The fact that some people happened to make great advancements and were religious doesn’t say much at all aside from the fact that humans have strong cognitive biases. And many of those great philosophers were, by no fault of their own, living in ages where a lot of the basic understanding of the world we take for granted was completely missing. If I lived in 1200, I’d think that lightning came from an angry God and that disease had similar causes.
The claim is that mystical experiences can be useful, not that they are necessarily factual. Speaking for myself, I’ve had practical math ideas from dreams. I check them when I wake up. Sometimes they are correct, sometimes they don’t quite work, and sometimes they are clearly nonsense. (No, subconscious, the fact that Martin van Buren was the 8th President of the United States does not tell me anything about zeros of L-functions.) But that’s not a reason to think that any such dream is objectively real. (Last night I had a dream where I was a powerful sorcerer who was working together with some cyborg elves to stop a terrible demonic menace. I wouldn’t mind if that were real.)
It seems here that you are interpreting remarks uncharitably, in trying to find the most negative and most religious interpretation of comments. Indeed, the post in question was made in part in response to a general attitude here that mystical experiences are a complete waste that are not at all helpful in any way.
Note that the difference between a mystical experience which leads to a math idea or to the structure of benzene and a religious one is that the former claim is something we can check. Religious mystical experiences very rarely fall into that category. But there’s another reason not to take religious mystical experiences seriously: they all disagree. Every major religion has people who have had mystical experiences and claim that they’ve encountered their deity or its servants or something similar, and yet they all disagree about very basic parts of how the world works. Indeed, that’s probably why some religions claim that there are evil forces out there giving deceptive mystical experiences: one needs some explanation for why every other group has experiences which don’t agree.
Smart is not at all the same thing as rational. I have no doubt that Robert Aumann is smarter than I am. I don’t know which of us is more rational, but there’s a decent argument that I am. I know I’m not as smart as Kary Mullis, and I’m pretty sure that I’m more rational than he is; for that matter, judging from these conversations, you probably are too. That people here are more rational than some Nobel Prize winners isn’t a positive statement about people here as much as it is an interesting statement about how irrational people can be and still do absolutely amazing, incredibly brilliant, highly innovative work.
This is covered in different comments.
Well, considering I am highly religious and find my religion to be highly rational, as well as thinking that there are many more ways to find knowledge than just repeated applications of Bayes’ theorem, I have to question your assertion of my rationality as such things are considered on this site.
Unless there really are evil forces out there.
Probably; however, given the tone of the Mysterious Answers sequence I don’t see how you should expect me to do otherwise. I have attempted to be extremely restrained on the subject, especially when responding to comments.
Sounds like the start of a novel to me. Something similar to how Stephenie Meyer got started on Twilight. Maybe if you write it up you will end up making millions.
Is Confucianism a religion? If it is, then Transhumanism is as well; if it is not, then Transhumanism in general is not currently a religion (although for some people it may be, but some people take the most random things for their religion, so that doesn’t say much).
Could you point me to where it is? I don’t see it.
I didn’t say that you were highly rational. I said you seemed to be more rational than Kary Mullis. That’s the point. There are some very irrational Nobel winners. Rationality is not the same thing as intelligence, or creativity or many other important traits.
Well, say hypothetically you were in the position of Screwtape or some other classical demon and you needed to draft a policy for what sort of fake mystical experiences your demons should produce. Which do you think would work better: answering every apparent attempt to get a mystical experience with something that the believer expects to be true, or answering every one of them with a revelation about a specific religion made up by the demons? The second seems a lot more effective. So why don’t they do this? Let me suggest that there’s a simple reason: the existence of such forces is the perfect post-hoc explanation. That’s why so many different religions, even when they vehemently disagree, can agree on some form of this.
I’m generally not inclined to see Confucianism as a religion, although it has strong religious elements. That’s why, for example, there can be self-identifying Chinese Christians who also practice aspects of Confucianism. In a very similar way, there’s no reason why someone couldn’t be a member of a major religion and still subscribe to what you have listed as transhumanist ideas. The danger of Strong AI could be plausible even in a religious framework (indeed, possibly even more so if one thinks that an intelligent artificial entity would be lacking the moral compass that comes from having a “soul”). Similarly, I’ve had serious discussions with Orthodox Jews over whether, in an Orthodox framework, cryonics should actually be halachically mandatory. For almost every step you list that is a belief related to a specific technology, I could probably find at least one religion which is theologically sympathetic to that belief. So if anything, this looks similar to Confucianism in exactly the ways that Confucianism doesn’t look like a classic religion.
Moreover, the aspects of Confucianism that most resemble a religion are precisely the parts that the transhumanist cluster lacks. Confucianism has veneration of ancestors and sacrifices, and a belief in some forms that worshiped ancestors can intervene in the world. These are classical beliefs that we associate with religions. Nothing you gave resembles anything like that. So this argument, if anything, undermines your claim: there’s an argument over whether Confucianism is a religion, and the most religion-like aspects of Confucianism are precisely the sorts of things which have no analog in the transhumanist cluster you’ve listed.
You missed my point entirely. I conceded that I might be using too broad a category with the title of religion and pointed to an example where there is debate over whether it counts as a religion, to determine membership in the category. (Incidentally, part of the argument is over whether ancestor worship is part of Confucianism or comes from traditional Chinese “heaven” worship.) Since you do not consider Confucianism a religion, Transhumanism is not a religion either, as I conceded.
Actually, I think mystical experiences with things the believer expects to be true work fine once the religion of the believer has already been modified away from strict truth. This creates groups that believe in very different things and have reinforcing experiences, such that if an attempt were made to restore the truth it would face social momentum against it rather than being a constant amid the confusion. As far as I can tell, though, both tactics have been used depending on what can be made to work.
I consider this to have been covered in the TheOtherDave line of discussion.
Transhumanism in general? Or just transhumanism of the sort we’re discussing?
E.g., suppose Sam is a transhumanist in this sense—that is, Sam affirms the possibility and desirability of fundamentally transforming the human condition by developing and making widely available technologies to eliminate aging and to greatly enhance human intellectual, physical, and psychological capacities.
But suppose further that Sam isn’t a transhumanist in the sense you’re discussing—for example, suppose Sam doesn’t endorse any particular text (#10), and doesn’t particularly believe that such a fundamental transformation of the human condition needs to involve anything like the coming of a messiah (#9), and doesn’t wear any special symbols (#8), and engages respectfully with people who disagree (#7), and doesn’t consider other people significantly less able to see the truth (#1).
Does it still follow, in your opinion, that Sam’s beliefs about the possibility and desirability of technologically enhancing human capacity should not be taught about in public schools, necessarily introduce biases into Sam’s judgment of others, etc. etc.?
If Sam has specific ways that are currently applicable in transforming the human condition and is doing so voluntarily and without profit to himself, then what he is practicing is indeed pure religion. “Pure religion is this, To visit the fatherless and widows in their affliction, and to keep himself unspotted from the world.” (James 1:27) However, this is not what is generally thought of as religion and is not regulated as such.
If Sam has specific ways that are currently applicable in transforming the human condition and is doing so for profit then what he is doing is a business and not a religion, at least by itself.
If Sam does not have specific ways that are currently applicable in transforming the human condition, then we may be getting into a belief structure that needs to be looked at in considering whether it is a religion. If he has any of the structure to his beliefs mentioned in the Wikipedia article, then it would indeed count as a religion. If he does not have the structured beliefs, then it does not count as a religion.
If Sam has phrases or words that let him identify those in the “in” group, then that of necessity introduces some amount of bias when dealing with those that are not in that group. However, this is not just a characteristic of religion; it is seen among RPG players, in technical fields, among friends, and in other situations.
Teaching that it might be possible to enhance human capacity should, by itself, be non-controversial (e.g. glasses, prosthetics, eye surgery, heart surgery, and tool use in general). In fact, enhancing our own capacity above its natural limits is what defines humans in the first place.
It is when the teaching reaches into theoretical structures that are grounded not in current reality but in beliefs that the problem may arise. Stick to beliefs about technology within the next ten years and you will be fine; go beyond ten years and you are essentially saying that fusion power will arrive in twenty years (or strong AI). Except instead of the one technology that you are working on, you list off dozens more equivalents of fusion power, and then dozens more for each successive decade past the first. It quickly moves from being science to becoming religion.
So teaching that regenerating organs might soon be commonplace is not a religious statement. Teaching that humans will be able to rewrite ourselves in order to make chimeras is. Teaching that computers will most likely continue to increase in processing power and that Strong AI might be possible is not a religious statement. Teaching that not only is Strong AI possible but that humans will be uploaded is. Hopefully you can see the distinction.
So I understand what you mean, but I don’t understand why you mean it. That is, this seems to be an extremely abnormal, somewhat vague notion of religion, and it creates serious issues of where the lines are, which are at best very blurry. I don’t think that most people would use “religion” to include any people trying to make long-term predictions about technology. In one of the World Book Encyclopedia yearbooks from the 1960s (I don’t remember the exact year, unfortunately; I can look it up when I next return to my parents’ house), there’s an essay by Isaac Asimov about the future of space travel. He describes a large set of milestones that he expects over the next 300 years and when they should occur. Would teaching that Asimov made those predictions be too religious in your view? Would it matter if he had a bunch of people who took those predictions very seriously? And if they thought that people should be educated about them? And if they said things like “we should do this because this is the last frontier” and “So there’s the choice in life. One either grows or one decays; Grow or die. I think we should grow.”? Is that too religious?
This is especially relevant in the context of a legal setting since you care in part about whether any of these ideas get taught or mentioned in schools. If you tried on this sort of basis to argue that discussing uploading and strong AI in public schools was unconstitutional, you’d probably be laughed out of court.
No, that is a fact that can be confirmed.
How are they taking it seriously? Are they sitting around designing rockets without having studied how to do so and without the ability to implement those designs? Are they fervently believing the predictions but not attempting to do things that are likely to help them come true? How much of their world view is affected by these predictions?
Not having specifics I am inclined to say most likely no.
I am a biased judge on this point. I am in favor of space travel. I know Asimov was as well. I think that Asimov was in a position to know something about space travel so therefore I am in favor of educating people about whatever he said (unless he said something completely crazy).
It would depend on the context.
Strong AI would get me laughed out of court by itself (probably). Uploading however could be made into a decent case.
Even stronger than someone who did not believe in the specifics of transhumanism would be someone who did believe in transhumanism attempting to gain the status of a religion. That would, in my opinion, get through the courts without any difficulties.
Ok. So why is the fact that serious philosophers like David Chalmers find uploading plausible not something that can be taught in public schools?
The standard of what constitutes a religion for purposes of First Amendment issues is complicated and subject to dispute, but no legal scholar thinks that “I think he was correct” is a reason to include a religious statement in school. The perceived correctness of a religion cannot be the relevant criterion. (Incidentally, according to the Asimov essay we should have had a colony on Mars about 20 years ago and should be preparing our mission to Pluto.)
Really? I’m curious as to how you would present such a case.
I’m not sure what you mean. Can you expand?
That was the plan: Apollo was supposed to be a sustained and expanding program, with nuclear rockets following shortly after. Instead we got the space shuttle and billions spent on programs that go nowhere, with any cheap or good technology getting discarded or sold off, as implementing it would cause unemployment in certain key congressional districts.
Assuming you have a public school that teaches anything on philosophy then I don’t think this is a problem.
The definition of a religion is fairly broad; pretty much, if you claim to be a religion and believe the claims that you make, then you are considered a religion. So if, say, SIAI wished to incorporate as a religion, I am pretty sure it could, and if challenged it could point to the meet-up groups, these boards, and the training that it does, and probably not even have to go to an appellate court on the issue.
I think I understand the distinction you’re making, yes. Thanks for clarifying.
Incorrect. Transhumanism and rationalism are different issues. Some people self-identify as rationalist but not transhumanist. And outside Less Wrong, most transhumanists do not consider themselves to be rationalist in any useful sense. If you simply mean that transhumanists think they’ve recognized something as important that others have not, then any movement thinks that, religious or not.
Again, confusing Less Wrong attitudes about AI with general transhumanism. But even then, making a deity would be very different from having a pre-existing deity (although I see why someone of LDS background would see less distinction there). But those who want to make a Friendly AI don’t intend to worship it or the like. Moreover, many people here who don’t think that an AI is likely to foom think that it is important to figure out how to make Friendly AI because non-Friendly, non-fooming AI can still do a lot of damage.
This is only a superficial comparison, and many people who are into cryonics (note: cryonics, not cryogenics; they are different things) don’t self-identify as transhumanist. Moreover, this is very different because many of the people involved, even proponents like Robin Hanson, estimate low chances of cryonics actually working. Could one imagine a major religion saying “yeah, so if you do what we want, maybe, in a few centuries, you might come back, if people in that day and age get around to it”?
One certainly has some transhumanists who think that the technology will soon be at that level. But that’s not at all the same as it applying to current people. This isn’t belief in immortality. This is belief that if our technology increases at a fast enough rate, then we will reach actuarial escape velocity.
Well, insofar as our current understanding of the human brain is correct and one self-identifies with all functionally equivalent copies, they are correct.
Same issues with comparing the AI to a deity as earlier.
Really? I don’t buy into almost any of the standard transhumanist claims. I’ve got a karma over 4000. I’ve made a lot of posts criticizing cryonics, the probability that AI will go foom, and Bayesianism, and almost all of them have gotten voted up. Indeed, I just made another post about AI not going foom and it is getting voted up as we speak, now at +4 for, frankly, not doing much at all other than raising an issue with no details. I presume from your remarks that Less Wrong is the environment we are talking about. If so, heretics apparently are treated very well here.
What would it mean in this context for someone to be a heretic? One common idea is that heretics will suffer or not do as well in a messianic age. Well, I haven’t gotten to discussing your notion of the Singularity as a messianic age (which is #9 on your list), but let’s say for now that that is a good analogy. Are heretics going to suffer? No. At worst, if cryonics works, then maybe some of the less damaged people who signed up for cryonics will still be around. Those who didn’t sign up will, yes, get oblivion. That last part has nothing to do with transhumanism unless transhumanism now means anyone who doesn’t believe in an afterlife. That’s obviously too broad. Moreover, someone who signs up for cryonics but doesn’t think Friendly AI is worthwhile, or doesn’t meet any of your other criteria, would be just as saved in that scenario. Now, another popular way of dealing with heretics is to kick them out of a community. That doesn’t seem to happen either. I can’t see any way in which anyone is being treated functionally as a heretic.
Um, Aumann’s Agreement is a theorem. It can’t have problems. You can argue that people are misapplying it, and I’d be inclined to agree.
Yes, but that’s exactly what Aumann says would be the expected problem among rational agents, lack of shared information, possibly some combination of information on both sides. There are, IMO, deep problems with how people apply Aumann’s theorem here, but this isn’t one of them.
This is stretching. They need those symbols to signal to medical professionals that they are signed up. And as someone who isn’t signed up for cryonics but is considering it, let me just note that if I see a cryonics symbol on someone I know that they are in fact likely to have interesting ideas, because they’ve gone and done something wildly outside the mainstream.
So this is possibly the best argument. Among some transhumanists there is an extremely optimistic attitude about the Singularity and an attitude about it that can only be described as evangelical. There’s been earlier discussion about this issue here, e.g. this. But that seems to be a small fraction of transhumanists as a whole, generally those of the Kurzweil disposition. But those people don’t meet much of what you list, not caring generally about Friendly AI or Aumann’s agreement theorem issues. Indeed, most of the people here who take the Singularity seriously consider it likely to be a very bad thing for humanity.
The Sequences are a good set of essays that do a good job summarizing a lot of known scientific knowledge and some philosophical arguments, with a little bit of fun besides. For a set of holy scriptures they have a surprisingly large number of people telling Eliezer when he’s just wrong. Moreover, to fit with your earlier analogy, this would require these scriptures to have been inspired by what, the supreme AI that doesn’t exist yet? That doesn’t seem to work. Note also that many people here disagree with major points of the Sequences. MWI and Eliezer’s take on metaethics are the two biggest ones. The person with the third-highest karma on the site, User:Alicorn, isn’t even a utilitarian, which amounts to a not at all small rejection of most of Eliezer’s ethical attitudes.
I don’t think that deep science or philosophy is required here, just not trying to shove groups into standard categories when they don’t apply.
Nitpick: I get the idea a lot of people here are not utilitarians, what’s noteworthy about Alicorn is that she’s not a consequentialist.
Could you explain the distinction? Are you using the word utilitarian to refer to a classical ‘maximise pleasure minimise pain’ utilitarian?
I think that Sniffnoy is distinguishing between someone who pushes people onto train tracks and someone who does not. Alicorn is unwilling to push people onto train tracks.
It amuses me to no end that “Alicorn is unwilling to push people onto train tracks” is presented as a summary of how I go against consensus here.
In contrast, people on the tracks are not amused.
I thought that both utilitarians and consequentialists would push someone onto train tracks. Since he drew a distinction between the two I was wondering what it was.
Yes, they will. Consequentialists are a superset of utilitarians. Not all consequentialists are utilitarians. For example, one could be a consequentialist who will choose torture over dust specks, but a utilitarian will not.
So what exactly is the difference?
Consequentialism decides between actions only by reducing them to expected outcomes (or probability distributions over outcomes), and comparing those outcomes. Utilitarianism is consequentialist, but with additional structure to how it compares outcomes. In particular, utilitarians combine uncertain outcomes by weighting them linearly with weights proportional to probability. Additionally, many (but not all) utilitarians also subdivide their utility functions by agent, specifying that individuals’ preferences are to be quantified and linearly combined.
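For concreteness, here is a minimal sketch of that distinction; the people, welfare numbers, and probabilities are all made up for illustration:

```python
# Illustrative only: made-up people, welfare numbers, and probabilities.

def expected_utility(lottery, utility):
    """Combine uncertain outcomes linearly, weighted by probability."""
    return sum(p * utility(outcome) for p, outcome in lottery)

def utilitarian_utility(outcome):
    # Per-person welfare numbers combined symmetrically (here, summed).
    return sum(outcome.values())

safe = [(1.0, {"A": 5, "B": 5, "C": 5})]
risky = [(0.6, {"A": 9, "B": 9, "C": 9}),
         (0.4, {"A": 0, "B": 0, "C": 0})]

print(expected_utility(safe, utilitarian_utility))   # 15.0
print(expected_utility(risky, utilitarian_utility))  # 16.2: take the gamble
```

A non-utilitarian consequentialist would still rank actions by their outcomes, but could compare outcomes with any preference ordering at all, not necessarily a linear sum of per-person numbers.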
Hm, that’s not how I was breaking it down. We really haven’t standardized on terminology here, have we? That would be a useful thing.
Here’s how I was using the terms:
Consequentialist—Morality/shouldness refers to a preference ordering over the set of possible universe-histories, not e.g. a set of social rules; consequences must always be considered in determining what is right. The ends (considered in totality, obviously) justify the means, or as Eliezer put it, “Shouldness flows backwards”. This description may be a bit too inclusive, but it still excludes people who say “You don’t push people onto the train tracks no matter what.”
Unnamed category—agents whose preferences are described by utility functions. This needn’t have much to do with morality at all, really, but an agent that was both rational and truly consequentialist would necessarily fall into this category, as otherwise it would (if I’m not mistaken) be vulnerable to Dutch books.
Utilitarian—Someone who not only uses a utility function, but uses one that assigns some number to each person P (possibilities include some measure of “net happiness”, or something based on P’s utility function… even though raw numbers from utility functions are meaningless...) and then computes utility from combining these numbers in some way that treats all people symmetrically (e.g. summing or averaging.)
Consequentialists still have to have an underlying “feel” for morality outside of consequentialism. That is, they need to have some preference ordering that is not itself consequentialist in nature, be it social Darwinism, extreme nationalism, or whatever other grouping it may be.
Utilitarianism is a subset of consequentialism that takes as its preference ordering the overall happiness of society.
Yes, consequentialism is a criterion a system can satisfy, not a system in and of itself. Your definition of utilitarianism is too narrow, though, in that it seems to only include “classical utilitarianism”, and not e.g. preference utilitarianism.
I’m not sure that the distinctions are precise. As I understand it, a utilitarian assigns everyone a utility and then just takes the sum, and sees how to maximize that. A consequentialist might not calculate their end goals in that way even as they are willing to take any relevant actions to move the world into a state they consider better.
1 - I can give you that one. Still, it is disturbing to read in multiple posts people advocating lying rather than telling the truth to people who aren’t in the “in” crowd.
2, 6 - how familiar are you with Gnosticism?
4 - that is just quibbling with how immortality is defined. Same with 3.
8 - yes, it was a stretch from anyone’s perspective except the LDS one, so I will give you that one.
And again, how familiar are you with Gnosticism?
7, 10 - Interesting points. That isn’t what I see happening in the main Sequences, but it does fit with the scoring of some of my comments. These are also the only two answers that actually address the argument to show that it is not a solidified religion, at least not on this site.
4 - I am set up for cryonics, and I do not ‘believe’ in ‘eternal life’. 1) Even if cryonics works, I am very unlikely to live forever; forever is a long time. I would probably live a long time, though, and that sounds better than not. 2) I don’t think cryonics is very likely to work, but it has a high payoff and it’s not absurdly expensive, so it makes sense even if it has a (say) 5% chance of working.
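A minimal expected-value sketch of that bet; the cost, payoff, and probability below are illustrative placeholders, not real cryonics figures:

```python
# Made-up figures, illustrative only; not actual cryonics costs or odds.
p_works = 0.05               # assumed chance cryonics works
cost = 100_000               # assumed total cost in dollars
payoff_if_works = 5_000_000  # assumed dollar-equivalent value of revival

ev = p_works * payoff_if_works - cost
print(ev)  # 150000.0: positive expected value under these assumptions
```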
9 - ‘believing’ in the singularity (I hate that word, but I am using it to refer to the time when an intelligence explosion (rapidly self-improving intelligence) takes place) is more like Native Americans believing in European settlers after Columbus arrived. Big changes are coming. They might be good or they might be bad, but they’re coming, and you’d like to try to make them good.
This is an example of not being aware that your beliefs are beliefs. Where is the evidence of an intelligence explosion? Where is the AI that can pass a Turing test, let alone a strong AI doing so? Are you aware that AI is like fusion in that it is always some 20 years in the future and has been for much more than 20 years?
What are you referring to?
LDS aren’t the only religion that has ritual items of clothing (compare, for example, the tzitzit worn by many Orthodox Jews), so that’s one of the less stretched examples in that respect. But it is still silly, since people carry specific things for medical reasons all the time. Thus, people who have allergies to common medications will sometimes wear armbands to let doctors know.
Somewhat but I fail to see the relevancy. Can you expand?
Huh? Among other problems, see the remark that you are confusing transhumanism with rationalism. You are taking a large set of different ideas, some of which don’t normally even go together, and putting them all together in an attempt to make something which (vaguely) resembles a religion.
Knowing About Biases Can Hurt People, many of the comments in Crises of Faith, there are other places but those are the two examples I have down for this.
The tenets of Gnosticism are that the God of the Bible is evil (this is constantly being brought up in the Mysterious Answers sequence) and is here to ensnare us, who are the true gods. (Also, that the material world is evil (see uploads).) To free ourselves from this state we must create a god (or ourselves become god), though not necessarily worship this god. Now comes the bootstrapping problem: where does this knowledge come from, and how do we create this god? The answer is that we have always had it and must merely rediscover it, using those that have seen the way as our guides. It is really just a confusion of the temple ceremony (if one is LDS and believes the temple ceremonies existed anciently) with the standard twists.
Hopefully that clears up how I see a connection and don’t see your objections on those points as being arguments against it being a religion.
They are all here on this site.
Um, really? The primary point there doesn’t seem to be about whether only the in-crowd should know about things but whether knowing about cognitive biases is safe for anyone. But maybe that’s just me.
But that doesn’t fit with your claim that the truth is being reserved for some elect in-group. No one (from my quick reading of the thread) is talking about just telling things to people who are already members of some super-secret transhumanist rationalist amalgam, just that there are some people out there who aren’t going to respond productively to being told certain things. So the claimed in-group includes a large number of people who haven’t ever heard of Less Wrong.
Regarding the issues with gnosticism, that’s not an accurate summary of gnosticism in that gnostics thought that there was a God superior to the Demiurge. This seems pretty different from a view which has no deity at all and a plan to construct something that if you squint at it in a funny way might sort of resemble a deity if certain results occur. And no one, absolutely no one, would claim that we somehow have to rediscover pre-existing knowledge to be more rational to help make Friendly AI. Indeed, Eliezer has repeatedly emphasized that an important part of science and actual productivity is to realize that there aren’t any ancient sources of hidden knowledge, and that a logician today is really much more worth listening to than anything in Aristotle. (See, e.g. his comments here).
So the argument is that they are all mentioned here and that some people subscribe to some or all of them? Sure, and some people here like the Red Sox. And a lot of others think that studying pure math is fun. And we have a sizable fraction that thinks everyone should know how to use Spivak pronouns. The presence of certain classes of opinions in some fraction of a population doesn’t necessarily tell you very much.
Only some of them thought this; others thought that they were the superior gods and had been tricked by the demiurge into giving up their positions of power. Still others thought that there was no superior god but that they needed to make or become the superior god.
Preexisting is not the same as ancient. The gnostics themselves had to have come up with the ideas of gnosticism without someone previously having come up with that exact idea before. Eliezer has claimed to have seen the need for a friendly AI that goes foom; hence for this thing he is a guide. He claims to have come to this conclusion logically, and the rationality that allows for the use of logic was preexisting within himself. No one else previously needs to have thought of the need for friendly AI (with the notable exception of science fiction writers...).
I am fine with using one for undetermined gender. I am not going to start using Spivak pronouns.
I am not arguing that everyone is part of this particular religious group. I believe that almost all forms of transhumanism are religious, but not all of them follow the same patterns.
Yes, ok. But it seems clear that the most common form didn’t. So right now, we’re cherry-picking specific parts of a vague transhumanist cluster and then attaching them to specific cherry-picked gnostic beliefs. Do you see why that might not be persuasive?
Granted. But the point still holds. None of what Eliezer has said claims to have anything to do with deep pre-existing sources of knowledge.
Missing my point. I don’t care what pronouns you use. The point is that there are a lot of non-standard beliefs that are common here, and some standard beliefs too. You can’t just pick the specific set that you think most resembles a religion and act as if those are the relevant dominant beliefs. Or rather, you can, but it isn’t very productive if you are trying to answer some question of the form “Does the cluster of beliefs common among users at Less Wrong resemble what is generally called a religion?”
Given that Confucianism is not a religion, then as currently constituted neither Less Wrong nor Transhumanism is in general a religion. Although the Singularity2045 people are one, if it is more than one guy being very enthusiastic about things.
Unless you want to argue that Less Wrong is a religion, that isn’t helpful. These are still separate things. As it happens most people here endorse both of them—this isn’t a coincidence, the transhumanism is a result of the rationalism—but in general they are still separate. In other places too you seem to have conflated transhumanism in general or rationalism in general with Less Wrong in particular; do you think most transhumanists have ever heard of Eliezer’s sequences? Nor do they necessarily expect a technological singularity or that cryonics will work.
Hm—now that I look at your original comment, you actually wrote
But from there on out you’ve just referred to “transhumanism” rather than this website. Please stop doing this; use the proper terms for what you’re referring to so people actually know what you’re talking about.
Would you rather I use “Less Wrong”ism?
If that’s what you mean, yes.
Having defined that I was restricting the comments to Transhumanism as expressed on this site, I saw no reason to continue restating the issue. Also, many, but not all, of the points brought up are not restricted to transhumanism as found on this site, as has been established within this discussion, but apply to transhumanism in general.
The problem is that using one term to mean something closely related is bound to cause confusion even if you explicitly redefine it at the outset. I would really advise against doing that. And you weren’t even entirely explicit about it; you wrote “Transhumanism, at least as expressed on this site, is as far as I can tell a religion” and then used “transhumanism” thereafter. It would have been clearer had you written something like “Transhumanism, at least as expressed on this site (which hereafter I will just refer to as transhumanism)”, but even then I would advise against it for the reason above. (Especially because people often join these discussions in the middle.)
Added the parenthesis. It is too late in the discussion to go through and relabel everything in my opinion. Doing so would also require more of an explanation of what was being talked about in my opinion.
OK, thanks, that’s at least a little helpful. I’ll disappear from this discussion now.
EDIT: To clarify, this is because really I think this whole discussion is pointless and as TheOtherDave and Vladimir Nesov have pointed out, for reasons that should be clear if you’ve read Eliezer’s 37 ways that words can be wrong. :)
I have read 37 ways that words can be wrong. Your argument is that I violated #20, which I feel is a valid point, so I attempted to fix it. Not all of the points on that list are valid in my opinion. For instance, it is necessary to agree on a definition of what a religion is for my argument to make sense, and I have attempted to present that argument in detail in my response to TheOtherDave. I have also attempted to show how this discussion is not pointless, but if you disagree with me then we will have to agree to disagree, which I can do because I don’t accept Eliezer as an authority figure.
For what it is worth, there may be issues with #9 and #11 also.
Yes; #11 was what I considered to be the problem here. I wasn’t thinking about “defying common usage without a reason” as something where the problem was nonobvious; though he happens to have written about it, referring to the sequences for that would be a cannon-to-kill-a-mosquito sort of thing and didn’t even occur to me.
It might be closer to a Pseudoreligion.
I thought I already answered the issue with 11?
Which doesn’t seem to be a term you’ve defined at all.
Some of them. I’m confused with some of your apparent answers to that so I’m not completely sure. It may be a failing on my part.
Which is why I haven’t brought it up before, I would say go look it up but then I would be violating a few more of the items on that list. It is also much harder to point to an example and say this is a pseudoreligion, but not a religion and not just some other form of association (at least it is harder for me).
Yes, I’ve seen the term before. The reason I’m asking you to define it if you are going to use it is because like the term “religion” it has different meanings in different contexts when different people are using it (although as far as I can tell most people use it to mean “recent religion that I don’t like” in a way similar to how some people use the term “cult”.) So without expanding out precisely what you mean it isn’t a helpful term.
As far as I can tell a pseudoreligion is a religion that hasn’t coalesced as of yet into a distinct set of shared beliefs. That is how I would use the term.
However, this contradicts many of the ways that it is used generally, which seem to match your view of how people use the term. I am therefore not sure that it is a helpful term given the common usage and its connotations. “Cult” is similarly a difficult word; it is useful in a technical sense to denote the worship of something, but commonly has a very different meaning.
Professional golfers also satisfy points 1, 7, 8 and 10.
Belief in the elect: They believe that they are better golfers than others, and that only they are worthy of competing in the PGA Championship; lesser mortals must be relegated to less prestigious competitions and will be kept out of the clubhouse by armed security guards.
Belief that everyone who disagrees with them is a heretic: They laugh at and mock people who think that you should use a 9-iron when your ball is on the green, or that you should start off with a putter on a par-5 hole. These people are automatically assumed to be less skilled at golf than they are.
Sacred Symbols: They frequently carry around bags of golf clubs and wear special clothing; these can spark conversations with non-golfers.
Holy Script: Every two years, the Royal and Ancient Golf Club of St. Andrews publishes a book called “The Rules of Golf”; this is accorded sacred status and anyone who refuses to follow it is expelled from the brotherhood of professional golfers in disgrace.
I maintain that insofar as we tick any of these boxes, we do so for reasons more like the reasons professional golfers tick them than the reasons the Catholic Church ticks them.
2,3,4,5,6,9 take a lot more explanation as they are unusual beliefs among secular groups, but all we can try to do is justify why we believe them. Have you tried reading the Sequences?
Although note that some sociologists have argued that modern sports do in fact act functionally like religions, although that claim is mainly made about the sports fans. So I’m not sure this is helpful, other than to suggest that religion is such a vague supercategory that noting that something has aspects of it just isn’t helpful.
I have read the sequences, which is why I brought this up. My statements have either specific posts or specific comments in mind from the sequences.
Hypothetically, if someone were to invent an immortality pill which was scientifically proven to work (I am not asserting that this is possible) then would you describe anybody who believed this pill worked as having converted to a new religion? Assuming here that the claim that this pill offers immortality is no more controversial than the claim that paracetamol relieves headaches.
Immortality that is proven to work entails a contradiction with the second law of thermodynamics; besides, one could only know the pill works up to the age of the oldest person who has taken it without dying of anything but suicide or accident. Therefore, given the impossibility of proving that the pill provides immortality, I would consider the belief that the pill has conferred immortality to be a religious belief, but not necessarily a religion in itself.
If it has been shown that the pill does increase the life expectancy dramatically with near certainty then I would take the pill under the expectation of living longer but not being immortal.
Fair enough; suppose the pill only increases life expectancy. In that case would you say anyone who believes it works is religious?
Most transhumanists also don’t necessarily believe in living forever, they just want to live for a very long time. My point was that if the evidence suggests this is possible, it is not a symptom of religious faith to believe that it is possible. If you disagree, the correct approach is to discuss the evidence, not go throwing ad hominems around.
If the pill is only claimed to increase life expectancy and has been shown to do so then it is not religious. If it has not been shown to do so then it is in effect religious in nature (or snake oil, depending on who believes what and what the pill actually does).
I fully expect life expectancies to rise and the quality of life as we age to also increase. I think that this is in agreement with science as we currently understand it. However, this is not what is claimed on this site. Terms like millions of years or more are given with great fervor even though there isn’t any evidence of that being possible.
I wasn’t giving a literal example. My point was that “transhumanists believe X, some religions believe Y, which is similar to X, therefore transhumanism is a religion” is a stupid argument. After all, Abrahamic religions and transhumanists agree that snow is white, but nobody would consider that a good argument.
If you wish to criticise the beliefs held by transhumanists you must criticise the beliefs directly, not reason by analogy to superficially similar beliefs held by other groups.
I just did for the example you gave. Did you want me to do so for all of the examples given on my list?
Since many of the examples on your list are things that transhumanists don’t believe, or things that only some people using the name believe, I’m not necessarily interested. I already know that literally eternal lifespans are impossible within current physics, for example.
I’m still unsure about the millions-of-years thing; I don’t know of any physical principle that prohibits it, so I’m not willing to rule it out just yet. Also, actuarial escape velocity only requires that life expectancies rise, which you say you expect, and that they do so at a certain rate.
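A toy sketch of that rate condition, with made-up starting age, expectancy, and annual gain: if medicine adds more than one year of remaining life expectancy per calendar year, the projected death date keeps receding.

```python
# Made-up numbers, illustrative only.
age, remaining = 30, 50.0  # assumed current age and remaining expectancy
gain_per_year = 1.2        # assumed years of expectancy gained per year

for _ in range(100):
    age += 1
    remaining += gain_per_year - 1  # one year spent, gain_per_year added
    if remaining <= 0:
        print("projected death at age", age)
        break
else:
    print("still receding at age", age)  # gain > 1/year: escape velocity
```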
If you want to discuss this further then find someone else. Your post which started this about “transhumanists believe X, some religions believe Y, which is similar to X, therefore transhumanism is a religion” makes me very sceptical of the possibility that we will reach agreement.
Well, these “Abrahamic” traits could be taken to describe the biased epistemological dynamics of various other ingroups.