Spooky-Inadequacy-at-a-Distance?
All these depictions of multi-tiered stages that action and incentives have to pass through (venture capital going through rounds A → B → C; the recursion of "it's not just what you think, but what you think others think"; "what will the journalists say about what the politicians say outside the Overton window"; etc.) seem to me to share an underlying theme: the further away you stand, the more screwed up and misaligned the effects become, and the more we're locked into weird inadequate equilibria. And this reminds me of an analogy from physics. (I HOPE it's just a casual analogy, and not indicative of some deeper "spooky inadequacy at a distance" suggesting we're screwed on this matter all the way down to the physics itself!)

Think of a chair dragged across the kitchen floor: the legs vibrate jerkily against the floor and screech. Or earthquakes: the plates move continuously, but the quakes happen in jerks, pressure building up until it finally jumps over a threshold. Or a heater ticking loudly as its metal expands in fits and starts. All of these are underlain by one physical fact: static friction is greater than kinetic friction. So even when you move the chair perfectly continuously and gradually at the top, the further out you go along the chain of steps (in this case, atoms), the jerkier things get at the endpoint. Rather like the gradual winds of change in public opinion only getting realized in jerks: revolutions, rapid phase transitions over a threshold.

Anyway, that's the thought I couldn't shake while reading this chapter. I cling to the hope that some of these meta-level dynamics EY depicts are remediable somehow, and that we're not predeterminedly screwed on a physical level, unable to confront them.
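For the curious, the stick-slip mechanism is easy to sketch in code. This is a toy model, not physics-grade: a block is dragged by a spring whose far end moves at a perfectly constant velocity, and the block only breaks free once the spring's pull exceeds a static-friction threshold, then sticks again at the lower kinetic-friction level. All the parameter values here are made up for illustration.

```python
def stick_slip(steps=2000, dt=0.01, drive_v=1.0, k=5.0,
               static_f=4.0, kinetic_f=2.0):
    """Toy stick-slip model: smooth drive in, jerky motion out.

    A block sits still ("stick") while a spring of stiffness k,
    stretched by a drive point moving at constant velocity drive_v,
    builds up force. When the spring force exceeds the static friction
    threshold, the block slips forward until the spring force drops to
    the (lower) kinetic friction level, then sticks again.
    Returns the list of block positions over time.
    """
    block_x = 0.0
    positions = []
    for i in range(steps):
        drive_x = drive_v * i * dt              # drive moves continuously
        spring_force = k * (drive_x - block_x)  # stretch builds up force
        if spring_force > static_f:
            # Slip: quasi-static jump forward until the spring force
            # relaxes down to the kinetic friction level.
            block_x = drive_x - kinetic_f / k
        positions.append(block_x)
    return positions

pos = stick_slip()
# Even though the drive advanced smoothly every single step, the block's
# trajectory is a staircase: long flat "stick" plateaus punctuated by
# sudden "slip" jumps.
```

Plotting `pos` against time gives the classic sawtooth-staircase picture: the smooth, continuous input at one end of the chain comes out as discrete jerks at the other, which is exactly the earthquake/chair-screech pattern described above.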
Clearly we need to kick the shit out of these inadequacies! Maybe a rough rule of thumb: get up close and personal, right next to them, rather than out at a distance of three recursive middlemen (IF they're changeable at all), to lower the entry barriers and stand a better chance of the "just snap out of it." Proposal: 1. Get a friendly, aligned AI in power. 2. Systematically go through the whole tree of fruits, minimizing the inadequacy-distance. 3. If you got past step 1 and you're both still alive and also not a paperclip, you're in the clear, by the Anthropic Principle. (Also, don't worry about step 2.) So, uh... just step 1, then.