Apologies that the complexity stuff seems a bit sketchy. I had to cut the strongest formalisms because developing them was very slow going and will probably take me months more. I have a sketch of a mathematical approach to phenomenology as topology over state space, but I need time to fully develop it before I can flesh out a rigorous explanation of what I mean by complexity here.
You’re right that this is not especially applicable in this form. It’s like describing topology when what you really need to do is integrate a function over real numbers. But given my hermeneutical preferences I think just trying to understand it is likely to lead to some “leveling up”, as you put it.
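To give a flavor of the direction (purely an illustrative sketch of the shape such a formalism might take; the specific objects and the measure below are placeholders of mine, not the developed theory): take $S$ to be a set of possible experiential states and $\tau \subseteq \mathcal{P}(S)$ a topology on $S$, where each open set $U \in \tau$ groups states an agent treats as the same kind of experience. A worldview is then an open cover $\mathcal{C} = \{U_i\}_{i \in I}$ of $S$, and one candidate measure of phenomenological complexity is the size of the smallest subcover that still covers all of $S$:

$$\mathrm{complexity}(\mathcal{C}) = \min\bigl\{\,\lvert\mathcal{C}'\rvert : \mathcal{C}' \subseteq \mathcal{C},\ \textstyle\bigcup \mathcal{C}' = S \,\bigr\}$$

Finer distinctions force larger covers, so a measure like this would track the intuition that a more complex phenomenology sustains more simultaneous distinctions over the same experiential territory.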
Thanks for giving more information about your theory.
I’d like to express skepticism that people would be able to level up after reading such a treatise. Maybe my mental models of past me and of other people aren’t very good, but my impression is that handing this sort of material to people is not an ideal way to improve their mental models.
As in, the act of explicating the whole phenomenon behind breaking down what we mean by concepts, systems, etc. seems very different from the act of giving people the tools or leading them to the point where they can start to update their mental models and explore different systems. (That is to say, the act of writing about Leveling Up seems very different than the things you’d need to do to help someone Level Up.)
Happy to extend this if it turns out that we’ve got differing ideas on the matter.
Sure, my program for helping people achieve more phenomenological complexity is not to point them at this. It’s instead to follow the advice I’ve previously outlined: act into fear and abandon all hope. That can be hard to apply, though, so folks often need to be specifically induced to face particular fears and abandon particular hopes. Once they’ve done this and experienced worldview disintegration, they can reintegrate around whatever they like (at least until they have to disintegrate again to make progress), and basically any choice seems fine there, since repeated disintegration eventually forces convergence: the worldview has to keep expanding until it accepts all of reality.
So basically my advice is: keep breaking down your assumptions and rebuilding them until you have none left. Then you will be enlightened.
Hm, I think the thing I’m trying to point at is my general intuition about how people tend to react (it’s just an internal model here, so feel free to counter w/ better-informed info), which says something like: if you want people to even get to the point where they can look at written essays on rationality and think to themselves, “Wait, this could apply to me!”, you need some sort of baseline of rationality to catalyze the whole thing.
My claim is that getting people to this sort of optimizing step, whereupon everything else can work, requires something different from what conventional wisdom might dictate (e.g. writing things and/or giving people general advice and telling them to go with it).
Something like “personally interacting with promising individuals, sending off social signals that you know cool stuff, and piquing their interest; then slowly getting them to want to care and starting them off on their journey via small tidbits / cultivating their interest” seems, I claim, more effective than just finding someone and saying, “Hey! Read this; it’ll shatter your worldview!”
Thanks, I appreciate the summary!
I agree: interactively working in person is more effective.