Psychotic “delusions” are more about holding certain genres of idea with a socially inappropriate amount of intensity and obsession than about holding a false idea. Lots of non-psychotic people hold false beliefs (e.g., religious people). And, interestingly, it is absolutely possible to hold a true belief in a psychotic way.
I have observed people during psychotic episodes get obsessed with the idea that social media was sending them personalized messages (quite true; targeted ads are real) or the idea that the nurses on the psych ward were lying to them (they were).
Preoccupation with the revelation of secret knowledge, with one’s own importance, with mistrust of others’ motives, and with influencing others’ thoughts or being influenced by others’ thoughts: these are classic psychotic themes.
And it can be a symptom of schizophrenia when someone’s mind gets disproportionately drawn to those themes. This is called being “paranoid” or “grandiose.”
But sometimes (and I suspect more often with more intelligent/self-aware people) the literal content of their paranoid or grandiose beliefs is true!
sometimes the truth really has been hidden!
sometimes people really are lying to you or trying to manipulate you!
sometimes you really are, in some ways, important! sometimes influential people really are paying attention to you!
of course people influence each other’s thoughts—not through telepathy but through communication!
a false psychotic-flavored thought is “they put a chip in my brain that controls my thoughts.” a true psychotic-flavored thought is “Hollywood moviemakers are trying to promote progressive values in the public by implanting messages in their movies.”
These thoughts can come from the same emotional drive; they are drawn from dwelling on the same theme of “anxiety that one’s own thoughts are externally influenced”; they are, in a deep sense, mere arbitrary verbal representations of a single mental phenomenon...
but if you take the content literally, then clearly one claim is true and one is false.
and a sufficiently smart/self-aware person will feel the “anxiety-about-mental-influence” experience, will search around for a thought that fits that vibe but is also true, and will come up with something a lot more credible than “they put a mind-control chip in my brain”, but one that is fundamentally coming from the same motive.
There’s an analogous but easier-to-recognize phenomenon with depression.
A depressed person’s mind is unusually drawn to obsessing over bad things. But this obviously doesn’t mean that no bad things are real or that no depressive’s depressing claims are true.
When a depressive literally believes they are already dead, we call that Cotard’s Delusion, a severe form of psychotic depression. When they say “everybody hates me” we call it a mere “distorted thought”. When they talk accurately about the heat death of the universe we call it “thermodynamics.” But it’s all coming from the same emotional place.
In general, mental illnesses, and mental states generally, provide a “tropism” towards thoughts that fit with certain emotional/aesthetic vibes.
Depression makes you dwell on thoughts of futility and despair
Anxiety makes you dwell on thoughts of things that can go wrong
Mania makes you dwell on thoughts of yourself as powerful or on the extreme importance of whatever you’re currently doing
Paranoid psychosis makes you dwell on thoughts of mistrust, secrets, and influencing/being influenced
You can, to some extent, “filter” your thoughts (or the ones you publicly express) by insisting that they make sense. You still have a bias towards the emotional “vibe” you’re disposed to gravitate towards; but maybe you don’t let absurd claims through your filter even if they fit the vibe. Maybe you grudgingly admit the truth of things that don’t fit the vibe but technically seem correct.
this does not mean that the underlying “tropism” or “bias” does not exist!!!
this does not mean that you believe things “only because they are true”!
in a certain sense, you are doing the exact same thing as the more overtly irrational person, just hiding it better!
the “bottom line” in terms of vibe has already been written, so it conveys no “updates” about the world
the “bottom line” in terms of details may still be informative because you’re checking that part and it’s flexible
“He’s not wrong but he’s still crazy” is a valid reaction to someone who seems to have a mental-illness-shaped tropism to their preoccupations.
e.g. if every post he writes, on a variety of topics, is negative and gloomy, then maybe his conclusions say more about him than about the truth concerning the topic;
he might still be right about some details but you shouldn’t update too far in the direction of “maybe I should be gloomy about this too”
Conversely, “this sounds like a classic crazy-person thought, but I still separately have to check whether it’s true” is also a valid and important move to make (when the issue is important enough to you that the extra effort is worth it).
Just because someone has a mental illness doesn’t mean every word out of their mouth is false!
(and of course this assumption—that “crazy” people never tell the truth—drives a lot of psychiatric abuse.)
link: https://roamresearch.com/#/app/srcpublic/page/71kfTFGmK
I once saw a video on Instagram of a psychiatrist recommending to other psychiatrists that they purchase ear scopes to check out their patients’ ears, because:
1. Apparently it is very common for folks with severe mental health issues to imagine that there is something in their ear (e.g., a bug, a listening device)
2. Doctors usually just say “you are wrong, there’s nothing in your ear” without looking
3. This destroys trust, so he started doing cursory checks with an ear scope
4. Far more often than he expected (I forget exactly, but something like 10-20%ish), there actually was something in the person’s ear—usually just earwax buildup, but occasionally something else like a dead insect—that was indeed causing the sensation, and he gained a clinical pathway to addressing his patients’ discomfort that he had previously lacked
This reminds me of dath ilan’s hallucination diagnosis from page 38 of Yudkowsky and Alicorn’s glowfic But Hurting People Is Wrong.
It’s pretty far from meeting dath ilan’s standard, though. In fact, an x-ray would be more than sufficient: anyone capable of planting something in someone’s ear would vastly prefer to place it somewhere harder to check, whereas nobody could defeat an x-ray machine, since metal parts are unavoidable.
This concern pops up in books on the Cold War: employees at every organization suffer from mental illnesses at roughly their base rates, but things get complicated at intelligence agencies, where paranoid/creative/adversarial people are rewarded and even influence R&D funding. An x-ray machine cleanly resolved the matter every time.
Tangential, but...
Schizophrenia is the archetypal definitely-biological mental disorder, but recently, for reasons relevant to the above, I’ve been wondering if that is wrong/confused. Here’s my alternate (admittedly kinda uninformed) model:
Psychosis is a biological state or neural attractor, which we can kind of symptomatically characterize, but which really can only be understood at a reductionistic level.
One of the symptoms/consequences of psychosis is holding extreme ideas with extreme intensity.
This symptom/consequence then triggers a variety of social dynamics that produce classic schizophrenia-like symptoms such as, as you say, “preoccupation with the revelation of secret knowledge, with one’s own importance, with mistrust of others’ motives, and with influencing others’ thoughts or being influenced by others’ thoughts”
That is, if you suddenly get an extreme idea (e.g. that the fly that buzzed past you is a sign from god that you should abandon your current life), you would expect dynamics like:
People get concerned for you and try to dissuade you, likely even conspiring in private to do so (and even if they’re not conspiring, it can seem like a conspiracy). In response, it might seem appropriate to distrust them.
Or, if one interprets it as them just lacking the relevant information, one needs to develop some theory of why one has access to special information that they don’t.
Or, if one is sympathetic to their concern, it would be logical to worry about one’s thoughts getting influenced.
But these sorts of dynamics can totally be triggered by extreme beliefs without psychosis! This might also be related to how Enneagram type 5 (the rationalist type) is especially prone to schizophrenia-like symptoms.
(When I think “in a psychotic way”, I think of the neurological disorder, but it seems like the way you use it in your comment is more like the schizophrenia-like social dynamic?)
In general, mental illnesses, and mental states generally, provide a “tropism” towards thoughts that fit with certain emotional/aesthetic vibes.
Also tangential: this is sort of a “general factor” model of mental states. That often seems applicable, but recently my default interpretation of factor models has been that they tend to get at intermediary variables and not root causes.
Let’s take an analogy with computer programs. If you look at the correlations in which sorts of processes run fast or slow, you might find a broad swathe of processes whose performance is highly correlated, because they are all predictably CPU-bound. However, when these processes are running slow, there will usually be some particular program that is exhausting the CPU and preventing the others from running. This problematic program can vary massively from computer to computer, so it is hard to predict or model in general, but often easy to identify in the particular case by looking at which program is most extreme.
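Here’s a minimal sketch of that analogy in Python (all the numbers and the setup are made up for illustration, not a model of anything clinical): each simulated machine has one randomly chosen “hog” process as the root cause, CPU contention is the intermediary variable that makes every process’s slowdown correlate, and on any particular machine the hog is found simply by looking for the most extreme process.

```python
import numpy as np

rng = np.random.default_rng(0)
n_machines, n_procs = 1000, 6

# Root cause: each machine has one randomly chosen "hog" process,
# with a machine-specific load. This varies from machine to machine.
hog = rng.integers(0, n_procs, size=n_machines)
hog_load = rng.uniform(0.5, 1.0, size=n_machines)

# Intermediary variable: CPU contention. The hog's load slows down
# *every* process on its machine, which is what creates the shared factor.
slowdown = hog_load[:, None] + rng.normal(0.0, 0.1, (n_machines, n_procs))
# The hog itself runs extra slow.
slowdown[np.arange(n_machines), hog] += 1.0

# Across machines, all processes' slowdowns are highly correlated,
# so a factor model happily reports one big "general factor" (contention)...
corr = np.corrcoef(slowdown, rowvar=False)
print("mean pairwise correlation:", corr[np.triu_indices(n_procs, k=1)].mean())

# ...but on any particular machine, the root cause is just the most
# extreme process, which the general factor says nothing about.
found = slowdown.argmax(axis=1)
print("fraction of hogs identified by extremity:", (found == hog).mean())
```

The point of the sketch: the correlation matrix (the factor) is real and useful for prediction, but it lives downstream of a root cause that differs per machine and is only recoverable case by case.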
Thank you, this is interesting and important. I worry that it overstates the similarity of different points on a spectrum, though.
in a certain sense, you are doing the exact same thing as the more overtly irrational person, just hiding it better!
In a certain sense, yes. In other, critical senses, no. This is a case where quantitative differences are big enough to be qualitative. When someone is clinically delusional, there are a few things which distinguish it from the more common wrong ideas. Among them: the inability to shut up about it when it’s not relevant, and the large negative impact on relationships and daily life. For many, many purposes, “hiding it better” is the distinction that matters.
I fully agree that “He’s not wrong but he’s still crazy” is valid (though I’d usually use less-direct phrasing). It’s pretty rare that “this sounds like a classic crazy-person thought, but I still separately have to check whether it’s true” happens to me, but it’s definitely not never.
I imagine they were obsessed with false versions of this idea, rather than obsessing about targeted advertising?
no! it sounded like “typical delusion stuff” at first until i listened carefully and yep that was a description of targeted ads.
For a while I ended up spending a lot of time thinking about specifically the versions of the idea where I couldn’t easily tell how true they were… which I suppose I do think is the correct place to be paying attention to?
One has to be a bit careful with this, though. E.g. someone experiencing or having experienced harassment may have a seemingly pathological obsession with the circumstances and people involved in the situation, but it may be completely proportional to the way that it affected them—it only seems pathological to people who didn’t encounter the same issues.
If it’s not serving them, it’s pathological by definition, right?
So obsessing about exactly those circumstances and types of people could be pathological if it’s done more than will protect them in the future, after weighing in the emotional cost of all that obsessing.
Of course we can’t just stop patterns of thought as soon as we decide they’re pathological. But deciding it doesn’t serve me so I want to change it is a start.
Yes, it’s proportional to the way it affected them—but most of the effect is in the repetition of thoughts about the incident and fear of future similar experiences. Obsessing about unpleasant events is natural, but it often seems pretty harmful itself.
Trauma is a horrible thing. There’s a delicate balance between supporting someone’s right and tendency to obsess over their trauma while also supporting their ability to quit re-traumatizing themselves by simulating their traumatic event repeatedly.
If it’s not serving them, it’s pathological by definition, right?
This seems way too strong; otherwise, any kind of belief or emotion that is not narrowly in pursuit of your goals is pathological.
I completely agree that it’s important to strike a balance between revisiting the incident and moving on.
but most of the effect is in the repetition of thoughts about the incident and fear of future similar experiences.
This seems partially wrong. The thoughts are usually consequences of the damage that is done, and they can be unhelpful in their own right, but they are not usually the problem. E.g. if you know that X is an abuser and people don’t believe you, I wouldn’t go so far as saying your mental dissonance about it is the problem.
Some psychiatry textbooks classify “overvalued ideas” as distinct from psychotic delusions.
Depending on how wide you make the definition, a whole rag-bag of DSM-5 diagnoses involve overvalued ideas (e.g., anorexia nervosa overvaluing the idea of being fat).