Basing your ethics on far-fetched notions like “intergalactic civilizations” and the “Singularity” is the purest example of science fiction delusion. I would characterize this movement as an exercise in collective delusion—very much like any other religion. Which I don’t have a problem with, as long as you don’t take your delusions too seriously and start thinking you have a holy mission to save the universe from the heathens. Unfortunately, that is exactly the sense I get from Mr. Yudkowsky and some of his more fanatical followers...
See, this is one of the predictions people get totally wrong when they try to interpret singularity activism using religion as a template. It’s not “saving the universe from the heathens”, it’s “optimizing the universe on behalf of everyone, even people who are foolish, shortsighted, and/or misinformed”.
Well-formed criticism (even if mean-spirited or uncharitable) is very useful, because it helps identify problems that can be corrected once recognized, and it reduces the likelihood of an insanity spiral caused by people agreeing with each other as a form of monkey grooming while trying to work together effectively. Poorly formed criticism is just noise.
You should talk more about the detailed mechanisms and processes of magical thinking, and less about trivial pattern-matches to science fiction.
Charitable interpretation: “the heathens” == “AGI developers who don’t care about Friendliness”.
Please spare me your “optimizations on my behalf” and refrain from telling me what I should talk about. Your language gives you away—it’s the same old grandiose totalitarian mindset in a new guise. Are these criticisms well-formed enough for you?
Yes, thank you, that’s much more precise :-)
Fears of competitive exclusion and loss of autonomy seem like entirely reasonable concerns for anyone who thoughtfully considers the status-quo trajectory of exponential technological improvement and looming resource limitations. However, it seems to me that singularitarians are generally aiming to respond honestly and ethically to these concerns, rather than actively doing anything that would make them more pressing.
If this isn’t clear to people who know enough to troll with as much precision and familiarity as you do, then I’d guess that something might be going wrong somewhere. Can you imagine anything you could see in this community that would allay some of your political concerns? What would constitute counterevidence for the future scenarios whose prospects make you unhappy?
(Emotional-level note: Please upgrade your politeness level. I was rude earlier, but escalating is a bad move even then; I’m de-escalating now. Your current politeness level is showing signs of treating debate as a conflict, and of trolling.)
Basing your ethics

Can you clarify that phrase? I can only parse it as “deriving your ethics from”, but ethical systems are derived from everyday observations like “Hey, it seems bad when people die”, followed by reasoning about them. Then the ethics exist, and “intergalactic civilizations are desirable” follows from them.
Maybe you meant “designating those notions as the most desirable things”? They are consequences of the ethical system, yeah, but “The thing you desire most is impossible”, while bad news, is no reason to change what you desire. (Which is why I called sour grapes.)
delusion

You seem to be confusing “A positive Singularity is desirable” (valuing lives, ethical systems) with “A positive Singularity is likely” (pattern-matching with sci-fi).
science fiction delusion

You are invoking the absurdity heuristic: “Intergalactic civilizations and singularities pattern-match science fiction, rather than newspapers.” This isn’t bad if you need a three-second judgement, but it is quite fallible (e.g., relativity, interracial marriage, atheism). It would be better to engage with the meat of the argument (why smarter-than-human intelligence is possible in principle, why AIs go flat or FOOM, why the stakes are high, why a supercritical AI is likely in practice (I don’t actually know that one)), pinpoint something in particular, and say “That can’t possibly be right” (backing it up with a model, a set of historical observations, or a gut feeling).
[religious vocabulary]

It’s common knowledge on LW that both the rationality thing (LW) and the AI thing (SIAI) are at unusually high risk of becoming cultish. If you can point to a particular problem, please do so; but reasoning by analogy (“They believe weird things, so do religions, therefore they’re like a religion”) proves little. (You know what else contained carbon? HITLER!)
you have a holy mission to save the universe

Are we talking feasibility, or desirability?
If feasibility, again, please point out specific problems. Or alternate ways to save the universe (incrementally, maybe, with charity for the poorest like VillageReach, or specialized research like SENS). Or more urgent risks to address (“civilization crumbles”). Or reasons why all avenues for big change are closed, and actions that might possibly slightly increase the probability of improving the world a little.
If desirability, well, yeah. People are dying. I need to stop that. Sure, it’s hubris and reaching above myself, sure I’m going to waste a lot of money on the equivalent of alchemy and then do it again on the next promising project (and maybe get outright scammed at some point), sure after all that I’m going to fail anyway, but, you know, Amy is dead and that shouldn’t happen to anyone else.
Right, so why feed him?
Because honest debaters can think they’re matching each other’s politeness level and go from “Hey, you have a bug there” to “Choke on a bucket of cock”. If AlphaOmega refuses to de-escalate, or if ey still looks like a troll when polite, I’ll shrug and walk away.
“Yo momma’s a cultist” is worthless, but be wary of ignoring all dissenters—evaporative cooling happens. (OTOH, Usenet.)
Edit: Aaand yup, ey’s an ass. Oh well, that’ll teach me a lesson.
Trolls serve an important function in the memetic ecology. We are the antibodies against outbreaks of ideological insanity and terminal groupthink. I’ve developed an entire philosophy of trolling, and am obligated to engage in it as a kind of personal jihad.
According to the web site linked in your profile, you are attempting to actively poison the memetic ecology by automated means. I’m not sure how to answer that, given that the whole site goes far over the top with comic book villainy, except to say that this particular brand of satire is probably dangerous to your mental health.
That site is obsolete. I create new sites every few months to reflect my current coordinates within the Multiverse of ideas. I am in the process of launching new “Multiversalism” memes which you can find at seanstrange.blogspot.com
There is no Universal truth system. In the language of cardinal numbers, Nihilism = 0, Universalism = 1, and Multiversalism = infinity.