Omega offers you two boxes, each containing a statement; upon choosing a box, you will instantly believe that statement. One box contains something true that you currently believe to be false, tailored to cause maximum disutility in your preferred ethical system; the other contains something false that you currently believe to be true, tailored to cause maximum utility.
Truth with negative consequences, or falsehood with positive ones? If you value nothing over truth, you will realise something terrible upon opening the first box, perhaps something that drives you to kill your family. If you value something other than truth, you will end up believing that the code you are writing will make pie, when in fact it will make an FAI.
Good epistemological rationality requires avoiding bias, contradiction, arbitrariness, etc. That is exactly what my rationality-based ethics needs.
I will defer to the problem of truth with negative consequences versus falsehood with positive ones: if you value nothing over truth, you will realise something terrible upon opening the first box, perhaps something that drives you to kill your family; if you value something other than truth, you will end up believing that the code you are writing will make pie, when in fact it will make an FAI.