I recently interviewed someone who has a lot of experience predicting systems, and they described four steps similar to your two above:
1. Observe the world and see if the system is sufficiently similar to other systems to predict based on intuitive analogies.
2. If there’s not a good analogy, understand the system from first principles, then try to reason about its equilibria.
3. If that doesn’t work, assume the world will stay in a stable state, and try to reason from that.
4. If that doesn’t work, figure out the worst-case scenario and plan from there.
I think steps 1 and 2 are what you do with expertise, and steps 3 and 4 are what you do without it.
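To make the fallback structure concrete, here’s a minimal sketch of the four steps as a decision procedure. This is my own illustration, not anything from the interview; every callable here (find_analogy, build_model, looks_stable, worst_case) is a hypothetical placeholder for a human judgment call, not a real API.

```python
# Toy illustration of the four-step fallback as a decision procedure.
# All four callables are hypothetical stand-ins for human judgment.

def predict(system, find_analogy, build_model, looks_stable, worst_case):
    # Step 1: if the system resembles something we know, predict by analogy.
    analogy = find_analogy(system)
    if analogy is not None:
        return f"predict {system} by analogy with {analogy}"

    # Step 2: otherwise, build a first-principles model and reason about
    # its equilibria (the step that quietly fails without expertise).
    model = build_model(system)
    if model is not None:
        return f"reason about the equilibria of {model}"

    # Step 3: otherwise, assume the current state is stable and extrapolate.
    if looks_stable(system):
        return f"assume {system} stays in its current state"

    # Step 4: otherwise, plan around the worst case.
    return f"plan for the worst case: {worst_case(system)}"


# Example usage with trivial stand-ins:
print(predict(
    "housing market",
    find_analogy=lambda s: None,   # no good analogy available
    build_model=lambda s: None,    # no first-principles model either
    looks_stable=lambda s: True,   # but the system looks stable
    worst_case=lambda s: "a crash",
))
# -> "assume housing market stays in its current state"
```

Passing the judgment calls in as parameters is just to emphasize that each step is a fallback, tried only when the previous one comes up empty.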
Yeah, that sounds about right to me. I think that, in terms of this framework, my claim is primarily “for reasonably complex systems, if you try to do step 2 without expertise, you will fail, but you may not realize you have failed”.
I’m also noticing I mean something slightly different by “expertise” than is typically meant. My intended meaning of “expertise” is more like “you have lots of data and observations about the system”, e.g. I think LW self-help stuff is reasonably likely to work (for the LW audience) because people have lots of detailed knowledge and observations about themselves and their friends.