Michael: could you elaborate on “Also, I have long asserted that experience associated with static configurations seems to me to be close to isomorphic to experience associated with multiple instantiations of a computation.”?
Ricky: I don’t think that assumption is being made; rather, you have to transform causal hypotheses with intramoment dependencies into ones without (this seems like it should always be possible).
Eliezer: this may indicate I missed the point of that section, but you can generate a high→low entropy history by computing a low→high entropy history and reversing the frames. It looks to me like Bayesian causality naturally accompanies increase in entropy, since (very handwavingly, this is hard for me to verbalize) P(M2|R1,R2) ≠ P(M2|M1,R1,R2) is more likely to hold if R has higher entropy than M. (Is there a different standard term?)
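A toy sketch of the inequality Eliezer gestures at, under assumptions not in the original: take R to be a fair coin (high entropy) and M a mod-4 counter that increments when R = 1 (low-entropy dynamics). Then M1 carries information about M2 beyond what (R1, R2) provides, so P(M2|R1,R2) ≠ P(M2|M1,R1,R2). The specific dynamics here are illustrative, not anything either speaker proposed.

```python
import random
import collections

random.seed(0)

# Toy dynamics: R is a fair coin (high entropy); M is a mod-4 counter
# that increments when R == 1. Collect (M1, R1, R2, M2) samples.
samples = []
m = 0
prev_r = random.randint(0, 1)
for _ in range(100_000):
    r = random.randint(0, 1)
    m2 = (m + r) % 4
    samples.append((m, prev_r, r, m2))
    m, prev_r = m2, r

def support_sizes(key):
    """For each conditioning value, count how many distinct M2 values occur."""
    counts = collections.defaultdict(set)
    for m1, r1, r2, m2 in samples:
        counts[key(m1, r1, r2)].add(m2)
    return [len(s) for s in counts.values()]

# Conditioning only on (R1, R2): M2 still ranges over all 4 counter states.
spread_without_m1 = max(support_sizes(lambda m1, r1, r2: (r1, r2)))
# Also conditioning on M1: M2 is fully determined (M2 = M1 + R2 mod 4).
spread_with_m1 = max(support_sizes(lambda m1, r1, r2: (m1, r1, r2)))

print(spread_without_m1)  # 4 — M1 matters, so the two conditionals differ
print(spread_with_m1)     # 1 — M2 is pinned down once M1 is known
```

This only exhibits one direction of the claim (a high-entropy R making M1 informative about M2); whether the inequality tracks the entropy gradient in general is the open question being discussed.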