But since I am running on corrupted hardware, I can’t occupy the epistemic state you want me to imagine.
It occurs to me that many (maybe even most) hypotheticals require you to accept an unreasonable epistemic state. Even something as simple as trusting that Omega is telling the truth is such a demand [as is trusting that his “fair coin” was a quantum random number generator rather than, say, a metal disc that he flipped with a deterministic amount of force; though that one is easier to grant as simple sloppy wording].
In general, thought experiments that depend on an achievable epistemic state can actually be performed and don’t need to remain thought experiments.
They can depend on an achievable epistemic state and yet be horribly impractical or immoral to set up (hello, trolley problems).