In my paradigm, human minds are made of something I call “microcognitive elements”, which are the “worker ants” or “worker bees” of the mind.
They are “primed”/tasked with certain high-level ideas and concepts, and try to “massage”/lubricate the mental gears into both using these concepts effectively (action/cognition) and interpreting things in terms of these concepts (perception).
The “differential” that microcognitive elements apply to make your models work is not necessarily related to those models, and may in fact be opposed to them (compensating for, or ignoring, the ways those models don’t fit the world).
Rationality is not necessarily about truth. Rationality is a “cognitive program” for the microcognitive elements. Some parts of the program may be “functionally”/“strategically”/“deliberately” framing things in deceptive ways, in order to make the program work better (for the kind of people it works for).
The specific disagreements I have with the “rationalist” culture:
The implied claim that the LessWrong paradigm has a monopoly on “rationality” and simply is “rationality”, rather than an attempted implementation of it: a set of cognitive strategies based on particular models and assumptions about how human minds work. If “rationality is about winning”, then anyone who is winning is being rational, whether they hold LW-approved beliefs or not.
Almost complete disregard for meta-rationality.
Denial of nebulosity, and fixation on the “imaginary objects” that are the output of the lossy operation “make things precise so they can be talked about in precise terms”.
All of these things have computational reasons; they are part of the cognitive trade-offs the LW memeplex/hive-mind makes due to its “cognitive specialization”. Nevertheless, I believe they are “wrong”, in the sense that they lead to you having an incorrect map/model of reality while strategically deceiving yourself into believing that your model is correct. I also believe they are part of the reason we are currently losing: you are being rational, but you are not being rational enough.
Our current trajectory does not result in a winning outcome.
Since reading the Sequences, I’ve made much more accurate predictions about the world.
Both the guiding principle of making beliefs pay rent in anticipated experience, as well as the tools by which to acquire those accurate beliefs, have worked for me.
So at an object level, I disagree with your claim. Also, if you’re going to introduce topics like “meta-rationality” and “nebulosity” as part of your disagreement, you kind of have to defend them. You can’t just link a word salad and expect people to engage. The first thing I’m looking for is a quick, one or two paragraph summary of the idea so I can decide whether it’s worth it to pursue further.