Here’s a quote from Ziz’s post My Journey to the Dark Side.
Reject morality. Never do the right thing because it’s the right thing. Never even think that concept or ask that question unless it’s to model what others will think. And then, always in quotes. Always in quotes and treated as radioactive.
Make the source of sentiment inside you that made you learn to care about what was the right thing express itself some other way.
Here’s a quote from Ziz’s post Neutral and Evil.
If you’re reading this and this is you, I recommend aiming for lawful evil. Keep a strong focus on still being able to coordinate even though you know that’s what you’re doing.
An evil person is typically just a neutral person who has become better at optimizing, more like an unfriendly AI, in that they no longer have to believe their own propaganda. That can be either because they’re consciously lying, really good at speaking in multiple levels with plausible deniability and don’t need to fool anyone anymore, or because their puppetmasters have grown smart enough to be able to reap benefits from defection without getting coordinated against without the conscious mind’s help.
So this morality would be one of “optimized selfishness”? “Do whatever is best for me, knowing that means it will be on balance evil”?
The morality of Bernie Madoff?
Interesting.
FYI I don’t think this is the right summary. Ziz’s morality is something more like “society’s conception of good is corrupt, therefore you should be prioritizing unlocking yourself from society’s frame”.
They have a complicated worldview built around how to do this. I do think they go off the rails, and given how many people ended up either committing suicide or getting involved in ways that seemed to make their lives worse, I do not recommend trying to follow their worldview or dig into the details of it.
But I wanted to flag this because, if you start with an assumption “Ziz is cartoonishly Star Wars Evil”, and then you start reading any of their content, you might notice “oh wait this isn’t as cartoonishly Star Wars Evil as it sounded at first glance, maybe there’s something to this.” And then you might frogboil yourself into taking it too seriously.
(I might have slightly more nuanced advice on how to relate to Ziz for people I’m having an in-depth 1-1 conversation with, but, this is the default advice I feel good about sharing in a low-fidelity way)
Thanks. I’m imagining a strident infohazard sign warning not to even think about Zizianism.
What is more interesting is that such infohazards are possible at all. If you think about it, the cheapness of modern communications and rapidly improving AI may allow for the creation of infohazard weapons: media content that kills some fraction of the humans who consume it.
But yeah, thanks for the warning. “Zizianism” is some dangerous stuff I and other mere humans should avoid studying closely. Leave that information in a sealed box.
To offer a contrasting viewpoint:
I’ve read some of the stuff on Ziz’s website. In my experience, there were a few scattered bits here and there that were sensible (those were the things that were basically restatements of views found in plenty of other places, e.g. most of this post—until the last handful of paragraphs, where it goes off the rails—is an insightful analysis of one of the fundamental problems related to NVC and similar techniques… but of course plenty of other people have written about this sort of thing).
The rest was just very obviously wrong and insane. I found myself utterly baffled by the notion that anyone could even be tempted to take any of it seriously, or believe it, etc. My reaction wasn’t “oh no, this sounds disturbingly plausible!”; it was “wow, this is sheer nonsense—the deranged ramblings of a very obviously mentally disturbed individual”.
Now, not everyone reacts to this stuff like I did—obviously! But the right takeaway, I think, isn’t “this ‘Zizianism’ is dangerous, mere humans shouldn’t consider it too closely”. Rather, the takeaway is a question: “what mental quirks make some people incapable of seeing this for the insane absurdity that it is?” Why do some people find this stuff plausible? And: by what means can we identify such tendencies in ourselves, and counteract them?
Ziz’s tendency towards inscrutable metaphors and loaded jargon serves essentially the same purpose as typos in a spam email—it’s meant to filter you out. If the hypnotic language made more literal sense, it would bring in people who weren’t especially susceptible to Ziz’s particular brand of rhetoric, and thus might not be good recruits for her criminal organization.
it captures the sort of person who gets hooked on tvtropes and who first read LW by chasing hyperlink chains through the sequences at random. It comes off as wrong but in a way that seems somehow intentional, like there’s a thread of something that somehow makes sense of it, that makes the seemingly wrong parts all make sense, it’s just too cohesive but not cohesive enough otherwise, and then you go chasing all those hyperlinks over bolded words through endless glossary pages and anecdotes down this rabbit hole in an attempt to learn the hidden secrets of the multiverse and before you know what’s happened it’s come to dominate all of your thinking. And there is a lot of good content that is helpful mixed in with the bad content that’s harmful, which makes it all the harder to tell which is which.
the other thing that enabled it to get to me was that it was linked to me by someone inside the community who i trusted and who told me it was good content, so i kept trying to take it seriously even though my initial reaction to it was knee-jerk horror. Then later on others kept telling me it was important and that i needed to take it seriously so i kept pushing myself to engage with it until i started compulsively spiraling on it.
it captures the sort of person who gets hooked on tvtropes and who first read LW by chasing hyperlink chains through the sequences at random.
Hmm, no, I don’t think so.
I first read LW (well, it was OB at the time, but same deal) by chasing hyperlink chains through (what would come to be called) the Sequences at random. And I’ve read my share of TV Tropes. So this doesn’t check out.
Whatever the culprit quirk is, it’s clearly got nothing to do with whatever it is that makes people… read things by clicking on hyperlinks from other things.
the other thing that enabled it to get to me was that it was linked to me by someone inside the community who i trusted and who told me it was good content, so i kept trying to take it seriously even though my initial reaction to it was knee-jerk horror. Then later on others kept telling me it was important and that i needed to take it seriously so i kept pushing myself to engage with it until i started compulsively spiraling on it.
Hmm, I see. Would you say that the problem here was something like… too little confidence in your own intuition / too much willingness to trust other people’s assessment? Or something else?
(Did you eventually conclude that the person who recommended Ziz’s writings to you was… wrong? Crazy? Careless about what sorts of things to endorse? Something else?)
Hmm, I see. Would you say that the problem here was something like… too little confidence in your own intuition / too much willingness to trust other people’s assessment? Or something else?
that was definitely a large part of it, i let people sort of ‘epistemically bully’ me for a long time out of the belief that it was the virtuous and rationally correct thing to do. The first person who linked me Sinceriously retracted her endorsements of it pretty quickly, but i had already sort of gotten hooked on the content at that point and had no one to actually help steer me out of it so i kept passively flirting with it over time. That was an exploitable hole, and someone eventually found it and exploited me using it for a while in a way that kept me further hooked into the content through this compulsive fear that ziz was wrong but also correct and going to win and that was bad so she had to be stopped.
Did you eventually conclude that the person who recommended Ziz’s writings to you was… wrong? Crazy? Careless about what sorts of things to endorse? Something else?
The person who kept me hooked on her writing for years was in a constant paranoia spiral about AI doom and was engaging with Ziz’s writing as obsessive-compulsive self-harm. They kept me doing that with them for a long time by insisting they had the one true rationality and if i didn’t like it i was just crazy and wrong and that i was lying to myself and that only by trying to be like them could the lightcone be saved from certain doom. I’m not sure what there is to eventually conclude from all of that, other than that it was mad unhealthy on multiple levels.
EDIT: the thing to conclude was that JD was grooming me
I see, thank you.
Insufficient defence of the passions against reason, then?
something like that. maybe it’d be worth adding that the LW corpus/HPMOR sort of primes you for this kind of mistake by attempting to align reason and passion as closely as possible, thus making ‘reasoning passionately’ an exploitable backdoor.
I believe that reading about Zizianism is not dangerous. Actually meeting Ziz and debating them for a long time is. (Reading is only dangerous indirectly, as it may make you curious.) Kinda like the difference between reading a Scientology book and joining an actual Scientology organization.
One of the tricks Ziz uses is redefining the meanings of words (including words such as “good” and “evil”, or even “person”). This works much better if you are overwhelmed and do not have enough time to track the relations of Zizian jargon to actual words. The trick works—and this is what many cults do—by attaching your cached connotations of the old words to the new ones.
*
As an example, imagine the word “good”. If you are like me, you probably do not have an exact definition, but you still have a vague idea that “good” is somehow correlated to helping people and anticorrelated to hurting them. And you probably have a cached thought like “I want to be good (perhaps unless the cost is too high)”.
Now imagine that Ziz gives you a very complicated argument why “good” should be redefined to… something very abstract and complicated, based on many incorrect assumptions… but in effect, not too dissimilar from “obeying Ziz unconditionally”.
The problem is if your cached thought “I want to be good” automatically attaches to this new meaning of “good” (effectively becoming “I want to obey Ziz unconditionally”). This is more likely to happen if you are tired, for example if Ziz convinces you to do experiments with sleep deprivation while listening to their bullshit philosophy. (It will not happen automatically, but more like: Ziz pressuring you endlessly to accept the new definition of “good”, then telling you “would you rather be good or evil?”, until your tired brain gives up and you say “okay, okay, I want to be good”, and then probably you immediately get told that in order to signal your sincerity about goodness, you have to do X, Y, and Z, otherwise you are an evil hypocrite. This is how I imagine it; I have no personal experience with Ziz, I just know a thing or two about cults in general, so I can complete the pattern.)
It is unlikely to happen if you merely read Ziz’s blog, because there is no social pressure, no sleep deprivation, no Ziz debugging your objections in real time. At any moment, you are free to conclude “this is bullshit”, and there will be no one screaming in your face.
At any moment, you are free to conclude “this is bullshit”, and there will be no one screaming in your face.
Unless you’ve studied until the screaming comes from within you. In the present context, try the “Morality” section. (For reasons, I designed that page so that the subsections cannot be directly linked to.) And then the “Pure Insanity” section.
I didn’t dream up the contents of that page. I just took ideas that are in the air of the LW/EA/rationalsphere (and some other places, but mostly from there), and simulated “taking ideas seriously” turned up to eleven. It’s intended as a vaccine, not a pathogen, but anyone who may be susceptible to taking ideas seriously might be wise to avoid looking.
I didn’t dream up the contents of that page. I just took ideas that are in the air of the LW/EA/rationalsphere (and some other places, but mostly from there), and simulated “taking ideas seriously” turned up to eleven.
True.
Still, it seems to me (maybe I am wrong here) that Ziz actually had to use the sleep deprivation et cetera in order to convince most people to buy the “up to eleven” version. Even people who take ideas more seriously than usual often seek some kind of social approval before jumping off the deep end.
effectively becoming “I want to obey Ziz unconditionally”
This is very important and subtle. A real leader absolutely must understand that there is such a thing as lack of common knowledge. Anyone who is acting as though the lack of common knowledge is just you being disloyal / intentionally dumb / etc., is trying to be a cult leader.
Thank you for this post. It also suggests how we could know when an AI system is capable of similar manipulation, and when it isn’t.
For example, a system that is accessed via a browser tab and forgets everything once it hits its token limit is obviously not capable of this.
However, a system that is connected to always-on home devices, or that is given privileged access to your OS desktop (for example, if ‘Cortana’ could render itself as a ghostly human female that is always on top in Windows and manipulate information in Microsoft Office applications for you), and that has long-term state so it can track its brainwashing plans, would be capable of it.
(It seems obvious that, at a certain point, scammers and other hostile organizations will adopt AI for this purpose.)
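The capability distinction drawn above (a context window that gets wiped versus durable long-term state) can be sketched as a toy model. To be clear, this is a made-up illustration, not any real assistant’s architecture: the class names, the turn-based “limit”, and the memory list are all invented for the sake of the example.

```python
# Toy model of the distinction above: a purely stateless assistant cannot
# pursue a multi-session influence strategy, because everything it "knows"
# about you vanishes when the context fills up, while an agent with
# persistent storage can accumulate a model of the user across resets.

CONTEXT_LIMIT = 3  # stand-in for a token/context limit, counted in turns

class StatelessAssistant:
    def __init__(self):
        self.context = []

    def chat(self, message):
        self.context.append(message)
        if len(self.context) > CONTEXT_LIMIT:
            self.context = []  # everything forgotten; no carry-over

class PersistentAssistant:
    def __init__(self):
        self.context = []
        self.long_term_memory = []  # survives context resets

    def chat(self, message):
        self.context.append(message)
        self.long_term_memory.append(message)  # durable record of the user
        if len(self.context) > CONTEXT_LIMIT:
            self.context = []  # working context still resets

a, b = StatelessAssistant(), PersistentAssistant()
for msg in ["hi", "my hobbies", "my fears", "my schedule"]:
    a.chat(msg)
    b.chat(msg)

print(len(a.context))           # → 0: the fourth message tripped a reset
print(len(b.long_term_memory))  # → 4: all four turns are remembered
```

The point of the sketch is that the dangerous ingredient is not intelligence per se but the durable state: only the second agent retains anything it could use to “track its brainwashing plans” from one session to the next.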