Eliezer has said that “it seems pretty obvious to me that some point in the not-too-distant future we’re going to build an AI [...] it will be a superintelligence relative to us [...] in one to ten decades and probably on the lower side of that.” ---- http://bloggingheads.tv/diavlogs/21857
The vast majority of very smart and accomplished people (e.g. Nobel Prize winners in the sciences, Fields medalists, founders of large tech corporations) do not subscribe to the view that the “singularity is near.” This raises a strong possibility that people like Eliezer who think it’s pretty obvious that “the singularity is near” are deluded for the same reason that the 9-11 Truthers are. As Yvain says, it’s a boost to one’s self-esteem to feel that one has “figured out a deep and important secret that the rest of the world is too complacent to realize.”
Has there been any discussion of this matter in the Less Wrong archives?
Most unpopular beliefs are false. However, if everyone subscribed to strict majoritarianism and never took up unpopular beliefs, intellectual progress would cease completely. There must come a point at which the cost we pay in wasted effort on false unpopular beliefs is worth the payoff in progress from new ideas, which of course all start off unpopular. So while I’d like 9-11 Truthers to see the error of their beliefs, I’d like to achieve that through argument based on fact, rather than by simply pointing out that everyone disagrees with them.
Also, of course, strict majoritarianism is self-defeating, since it’s a pretty unpopular stance in itself.
People could (at least in principle) entertain and advocate for unpopular beliefs without actually believing them. (I think Robin Hanson wrote a post about this in the early days of OB.)
Yep.
Variations on this theme have certainly come up. This site says it’s about rationality, yet the local consensus is weird or deviant; what if that’s being produced by the very same irrationality mechanisms that you all write about? Lots of people have posed that question.
With respect to your comparison: The idea that this will be the century of artificial intelligence is commonplace now. Silicon Valley has not quite become Singularity Valley, but it is now extremely common for people who work in the computer industry, even very senior ones, to anticipate a future that is radically science-fictional in character. Only a small number of your “very smart and accomplished people” would even have a considered opinion, pro or con, on Eliezer’s specific philosophy, but I don’t think the statement you quote is especially unusual or anomalous for its time.
You could say something similar about the 9-11 Truthers too—that they are part of the zeitgeist—though in locating their social support base, you’ll find it’s identifiably different to the culture in which singularity ideas are most potent. The generalization is far from universal, but I would say that singularity believers tend to be people from technical or scientific subcultures who feel personally empowered by the rise of technology, whereas 9-11 conspiracy believers are politically and socially minded and feel disempowered by the state of the world.
I should clarify. I did not mean to insult Eliezer—I think that he’s a well-intentioned and very brilliant guy. I also was not attempting to advocate majoritarian epistemology. Also, I acknowledge that even if Eliezer is misguided in his beliefs about the future, there are clearly other possible explanations besides “that other kind of status.”
To refine my question: When one adopts a view which
(a) Deviates from mainstream beliefs
(b) Is flattering to oneself
(c) Is comprehensive in scope and implications
one should be vigilant about the possibility that one is being influenced by a desire for “that other kind of status.”
Eliezer’s views about the expected value of SIAI’s activities seem to meet each of criteria (a), (b) and (c) fairly strongly. This does not mean that his views are wrong, but it does make me reluctant to take them very seriously without evidence that he (and others who hold such views) have exhibited a high level of vigilance about being influenced by a desire for “that other kind of status” in connection with these views.
Is there anywhere I can find evidence that Eliezer and others who share his views have exhibited such a high level of vigilance toward the possibility of being influenced by a desire for “that other kind of status” in connection with their views about the expected value of SIAI’s activities?
Eliezer may well be off on the time scale. I would guess he’s an order of magnitude off. But an incorrect guess about the timescale of a future event does not give rise to a strong possibility that he’s deluded, like the 9-11 Truthers, for ego reasons. Downvoted, because this reads more like an insult than a reasoned question.