Yup, strongly agree. I focused on the deterministic case because the point is easiest to understand there, but it also applies in the stochastic case.
I suspect people are doing something heuristic and possibly kludgy when they think about someone else gaining power.
I agree, though if I were trying to have a nice formalization, one thing I might do is look at what “power” looks like in a multiagent setting, where you can’t be “larger” than the environment, and so you can’t have perfectly calibrated beliefs about what’s going to happen.