This seems to me like a very ill-advised application of concepts related to FDT, or to anthropics in general?
Like:
Precommitment is wrong. Stripping yourself of options doesn't do you any good. One of the motivations behind FDT was precisely to recreate the outperformance of precommitted agents in certain specific situations without their underperformance in general.
It isn't likely? To describe you (in the broad sense of you across many branches of the Everettian multiverse) inside a simple physical universe, we need the relatively simple code of the physical universe plus the address of the branches containing "you". To describe you inside a simulation, you need the physics of the universe that contains the simulation, plus all of the above. To describe you as a substrate-independent algorithm, you need an astounding amount of complexity. So the probability that you are in a simulation is exponentially small (a sketch of this follows below).
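To make the shape of that complexity argument explicit, here is a rough sketch assuming a Solomonoff-style simplicity prior (my reading of the claim; the prior isn't named above, so take the notation as illustrative):

```latex
% Simplicity prior: a hypothesis H is weighted by its description length K(H) in bits.
P(H) \propto 2^{-K(H)}

% "You" as a physical observer: code of physics plus the address of your branches.
K_{\mathrm{phys}} \approx K(\text{laws of physics}) + K(\text{branch address})

% "You" as a simulated observer: everything above, plus the host universe
% and the simulator program running inside it.
K_{\mathrm{sim}} \approx K_{\mathrm{phys}} + K(\text{host physics}) + K(\text{simulator program})

% Each extra bit needed to specify you halves the weight:
\frac{P(\text{simulated})}{P(\text{physical})} \approx 2^{-(K_{\mathrm{sim}} - K_{\mathrm{phys}})}
```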
(A special case of the simulation point above.) If you are weighing the probability that you are being simulated by a hostile superintelligence, you need to behave exactly as you would without that thought, because acting in response to a threat (and being put in hell for acting in a way the adversary dislikes is a pure decision-theoretic threat) is a direct invitation to make the threat, as the sketch after this point shows.
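In expected-value terms, a minimal sketch of why giving in to threats invites them (the variables q, G, c are mine, introduced only for illustration):

```latex
% The adversary threatens only if it expects the threat to pay.
% Let q = probability the target gives in, G = adversary's gain if the target does,
% and c > 0 = adversary's cost of making (and credibly backing) the threat.
E[\text{threaten}] = q \cdot G - c

% A policy of never responding to threats sets q = 0, hence
E[\text{threaten}] = -c < 0
% and a rational adversary never makes the threat in the first place.
```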
I would like to see the details, maybe in a deliberately vague form.
So I don't think the resulting tragedies are outcomes of a rigorous application of FDT; they look more like consequences of the general observation that "emotionally powerful concepts (like rationality, consequentialism, or the singularity) can hurt you if you are already unstable enough and have a memetic immune disorder".