i think “actually most of your situations do not have that much subjunctive dependence” is pretty compelling personally
it’s not so much that most of the espoused decision theory is fundamentally incorrect but rather that subjunctive dependence is an empirical claim about how the world works, can be tested empirically, and seems insufficiently justified to me
however i think the obvious limitation of this kind of approach is that it has no model for ppl behaving in incoherent ways except as a strategy for gaslighting ppl about how accountable you are for your actions. this is a real strategy ppl often use but is not the whole of it imo
this is implied by how, as soon as ppl are not oppressing you “strategically”, the game theory around escalation breaks. by doing the Ziz approach, you wind up walking into bullets that were not meant for you, or maybe anyone, and have exerted no power here or counterfactually
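The "how much subjunctive dependence is there really" question above can be made concrete with a toy twin prisoner's dilemma, a minimal sketch I'm adding for illustration (the payoff numbers and function names are my own, not from the discussion): if your counterpart literally runs your decision procedure, cooperating is optimal; if they are causally and logically independent of you, defecting dominates, and acting as though the dependence exists just walks you into the bad outcome.

```python
# Toy twin prisoner's dilemma. Payoff to "you" given (your move, their move),
# using standard PD values: mutual cooperation 3, mutual defection 1,
# exploiting 5, being exploited 0.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def outcome(my_move: str, dependence: bool) -> int:
    """Payoff under an assumption about subjunctive dependence.

    With full dependence the other agent is running your decision
    procedure, so their move mirrors yours. With no dependence they
    are an independent agent playing their dominant strategy (defect).
    """
    their_move = my_move if dependence else "D"
    return PAYOFF[(my_move, their_move)]

# With full dependence, cooperating beats defecting (3 > 1)...
assert outcome("C", dependence=True) > outcome("D", dependence=True)
# ...but with zero dependence, cooperating is strictly worse (0 < 1):
assert outcome("C", dependence=False) < outcome("D", dependence=False)
```

The point of the sketch is that the optimal policy flips entirely on the `dependence` flag, which is exactly the empirical question: whether the other agent's behaviour actually covaries with your decision procedure, not whether it feels like it does.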
For reasons that maybe no normal person really understands, in an outcome that the theory’s inventors found very surprising, some people seem to be insane in a way that seizes very hard on a twisted version of this theory.
In a certain class of altered state,[1] a person’s awareness includes a wider part of their predictive world-model than usual. Rather than perceiving primarily the part of the self model which models themselves looking out into a world model, the normal gating mechanisms come apart and they perceive much more of their world-model directly (including being able to introspect on their brain’s copy of other people more vividly).
This world model includes other agents. Those models of other agents are now running in a much less sandboxed environment. It viscerally feels like there is extremely strong entanglement between their actions and those of the agents that might be modelling them, because their model of the other agents is able to read their self-model and vice versa, and in that state they’re kinda running it right on the bare-metal models themselves. Additionally, people’s models of other people generally use themselves as a template. If they’re thinking a lot about threats and blackmail and similar, it’s easy for that to leak into expecting that others are modelling this more than they actually are.
So their systems strongly predict that there is way more subjunctive dependence than is real, due to how the brain handles those kind of emergencies.[2]
Add in the thing where decision theory has counterintuitive suggestions and tries to operate kinda below the normal layer of decision process, plus people not being intuitively familiar with it, and yea, I can see why some people can get to weird places. Not reasonably predictable in advance, it’s a weird pitfall, but in retrospect fits.
Maybe it’s a good idea to write an explainer for this to try to mitigate this way people seem to implode. I might talk to some people.
The schizophrenia/psychosis/psychedelics-like cluster, often caused by being in extreme psychological states like those induced by cults and extreme perceived threat, especially with reckless mind exploration thrown in the mix.
[epistemic status: very speculative] it seems plausible this is in part a feature evolution built for handling situations where you seem to be in extreme danger: accepting a large chance of doing quite badly, damaging your epistemics, or acting in wildly bad ways, in exchange for some chance of finding a path through whatever put you in that state, by running a bunch of unsafe cognitive operations which might hit upon a way out of likely death. it sure seems like the common advice is things like “eat food”, “drink water”, “sleep at all”, “be around people who feel safe”, which feel like the kinds of things that would turn down those alarm bells. though this could also just be an entirely natural consequence of stress on a cognitive system
I could imagine something vaguely sorta like this being true but that isn’t like, something I’d confidently predict is a common sort of altered mental state to fall into, having been in altered states somewhere around that cluster.
I’d suspect that like, maybe there’s a component where they intuitively overestimate the dependence relative to other people, but probably it involves deliberate decisions to try to see things a certain way and stuff like that. (Though actually I have no idea what “strength of subjunctive dependence” really means, I think there are unsolved philosophical problems there.)
IMO Eliezer correctly identifies a crucial thing Ziz got wrong about decision theory.
Oh… huh. @Eliezer Yudkowsky, I think I figured it out.