Can we solve this problem by slightly modifying the hypothetical to say that Omega is computing your utility function perfectly in every respect except for whatever extent you care about truth for its own sake? Depending on exactly how we define Omega’s capabilities and the concept of utility, there probably is a sense in which the answer really is determined by definition (or in which the example is impossible to construct). But I took the spirit of the question to be “you are effectively guaranteed to get a massively huge dose of utility/disutility in basically every respect, but it’s the product of believing a false/true statement—what say you?”