FWIW, here’s a rough sense of where the sequence is going:
There are very different ways of seeing the world.
It’s a generally important life skill to understand when other people are seeing the world differently.
It is (probably) also an important life skill to be able to see the world in a few different ways. (Not necessarily any particular set of ways – there’s an experiential shift that comes from having had to learn to see a couple of different ways, which I think is really necessary for resolving disagreement and conflict, and for coexisting.) Put another way: being able to “hold your frame as object.”
This seems really important for humanity generally (for cosmopolitan coexistence).
This seems differently important for the rationality community.
I think of LessWrong as “about the gears frame”, and that’s fine (and quite good). But the natural ways of seeing the gears frame tend to leave people somewhat stuck or blind. LessWrong is about building a single, non-compartmentalized probabilistic model of the universe, but this needs to include the parts of the universe that gears-oriented people tend not to notice as readily.
It’s important to resolve the question “how do we have high epistemic standards about things that aren’t in the gears frame, or that can’t be made explicit enough to share between multiple people with different gears frames?”
These seem pretty good, and I think your current approach might suffice for this.
I don’t understand the question in the last point. I am being intentionally stupid and simple: what reason do you have to guess/believe that epistemic standards would be harder to apply to non-gears frames?
The motivating example there is “how to have high epistemic standards around introspection, since introspection isn’t publicly verifiable, but is also fairly important to practical rationality.” I think this is at least somewhat hard, and, separate from being hard, it’s a domain that doesn’t have agreed-upon rules.
(I realize it might not be clear how that sentence followed from the previous point, since there’s nothing intrinsically non-gearsy about introspection. The issue is something like “at least some of the people exploring introspection techniques are coming at it from fairly different frames, which can’t easily all communicate with each other. From the outside, it’s hard to tell the difference between ‘a person is wrong about facts’ and ‘a person’s frame is foreign.’”)
So the connection is “the straightforward way to increase epistemological competence is to talk about beliefs in detail; with introspection it is hard to apply this method, because the details can’t be shared effectively enough to build understanding.” It seems to me it is not that gears frames are special, but that frames have preconditions that must be met for them to work, and an area that permits a lot of frames makes it hard to satisfy any one frame’s prerequisites.