Singularity Mindset
In a fixed mindset, people believe their basic qualities, like their intelligence or talent, are simply fixed traits. In a growth mindset, people believe that their basic qualities can be modified gradually via dedication and incremental progress. Scott Alexander has cast some doubt on the benefits of growth mindset, but I think it still has merit, if only because it is closer to the truth.
Growth mindset is a good thing that doesn’t do enough. The situation calls for More Dakka. I present: Singularity Mindset.
In a Singularity Mindset, people believe they are self-modifying intelligences well past the singularity threshold, that their basic qualities can be multiplied by large constants by installing the right algorithms and bug fixes, and that even the best-optimized people are nowhere near hardware limitations. People who apply Singularity Mindset come to life with startling rapidity.
The seed of this post was planted in my head by Mind vs. Machine, which I read as a call to arms to be radically better humans, or at least better conversationalists. The seed sprouted when I noticed the other day that I’ve already blogged more words in 2018 (today is January 18) than in 2017.
Apparently, Kurzweil beat me to the name with Singularity University and Exponential Mindset, but (a) that’s geared towards businesses and technology instead of individuals, and (b) I’m agnostic about the exact shape of the foom, so I’ll stick to Singularity Mindset.
AI Worries
A tiny note of confusion noticed but unresolved can, over the course of years, fester into an all-consuming possession. One such question possessing me has come to a head:
Why is Eliezer Yudkowsky so much more worried about AI risk than I am?
I rapidly came up with a standard response, which goes something like this: after years of steadfastly correcting his own biases, acquiring information, and thinking clearly about the world’s biggest problems, he came to the right conclusion.
It’s a good explanation which provokes in me at least a pretense of updating models, but today I will entertain another narrative.
I think the difference between us is that Eliezer has the lived experience of Singularity Mindset, of deliberately self-modifying to the point of becoming unrecognizably intelligent and productive, and the simultaneous lived experience of seeing his own values drift and require extraordinary effort to keep in line.
Meanwhile, I’ve been plodding up the incremental steps of the Temple of Growth Mindset, humbly and patiently picking up the habits and mental hygiene that arise organically.
And so the difference between our worries about AI risk might be described as us individually typical-minding an AI. Eliezer’s System 1 says, “If AI minds grow up the way I grew up, boy are we in trouble.” My System 1 says, “Nah, we’ll be fine.”
Take Ideas Seriously
Singularity Mindset is taking ideas seriously:
I took Jordan Peterson’s advice seriously and cleaned my room. Turns out I actually love making the bed and decorating. A print of Kandinsky’s Composition VIII is the best thing that happened to my room.
I went to see Wicked on Broadway, took it as seriously as possible, and was mesmerized. I ended up bawling my eyes out for the entire second hour.
I took More Dakka seriously and doubled the amount I blog every day until it became physically fatiguing.
I took my own advice about Babble and constrained writing seriously and wrote three short stories sharing the same dialogue.
Great ideas are not just data points. They are (at bare minimum) algorithms, software updates for your upcoming Singularity. To integrate them properly so that they become yours—to take them with the seriousness they deserve—requires not just a local update on the map, but at the very least the design of a new cognitive submodule. In all likelihood, to get maximum mileage out of an idea requires a full-stack restructuring of the mind, from the map down to the perceptual structures.
Take this quote of Jung’s that I treasure:
Modern men cannot find God because they will not look low enough.
(More and more I find myself in the absurd position of writing on the ideas of this man whom I find impossible to read directly, but from whom I have derived such wisdom via second-hand sources.)
It is an injunction to be humble in directions orthogonal to the eighth virtue. To take Jung seriously deserves its own post, but in brief I read this quote in at least three directions.
Look low enough by focusing your mental energy on things that seem beneath you. Feed yourself properly, fix your sleep, and get exercise. Perhaps the most important thing you could be doing with your extraordinary intellectual capacity is to resolve the dysfunction within your immediate family. Perhaps the most important thing you could be writing involves repeating, summarizing, and coming up with catchy concept handles for the ideas of better men and women. Whatever it is, take it seriously, do it properly, and only good will come of it.
Look low enough by confronting the darkness in your personal hell. There are shadows there you instinctively convulse away from: horror movies, historical nightmares, the homeless on the street. Perhaps the most important thing you could be doing is admitting the existence of and then mastering your sadistic urges and delusions of genocide that lie just under the surface, ready to flood forth at the least opportune moment. Perhaps the demon is instead a spiral of anxiety and self-doubt that sends you into sobbing fits in the fetal position. What you need in your life is exactly where you least want to look. Wield your attention against the darkness whenever you have the slack. Only light can defeat shadow.
Look low enough by looking to your inner child for guidance. Oftentimes, progress curves look like “naive, cynical, naive but wise”:
For mathematicians, the curve is pre-rigor, rigor, post-rigor.
Picasso said, “It took me four years to paint like Raphael, but a lifetime to paint like a child.”
Scott Alexander foretold that idealism is the new cynicism.
Knowing about biases can hurt you.
If you’ve plateaued for a long time in the cynical stage, look low enough by reconstituting your inner child. Relinquish your cynicism with the same quickness with which you relinquished your naiveté. Despite your “better” judgment, trust and forgive people. Feel small when you stand beside the ocean. Babble like a baby. Try stupid shit.
Taking ideas seriously is terrifying. It requires that, at the drop of a hat, you be willing to extend such charity to a casual remark as to rebuild your whole mental machine on it if it proves true.
Extraordinary people take ideas with extraordinary seriousness. I will read a paper by skimming the abstract: “Huh, that sounds vaguely true.” Scott Alexander will read the paper and write three detailed criticisms, each longer than the paper itself. As for me, in the last five years I’ve read more words in Scott’s book reviews than in the books themselves. What I’m after is that gripping but elusive experience of watching a mind take ideas seriously and completely synthesize them into a vast ocean of knowledge.
Is there a deep truth that caught your fancy recently, that you toss around with your friends the way Slytherins toss Remembralls? You thought it through once and you think you’ve done your due diligence?
Take that idea seriously. Reorganize your mind and life around it. Travel the world looking for examples of it. At the very least, write a thousand words about it. God knows I want to hear about it.