It is imaginable — I wouldn’t say likely — that there are “universal moral laws” for human beings, which take the following form: “If you come to the conclusion ‘Utility is maximized if I murder these innocent people’, then it is more likely that your human brain has glitched and failed to reason correctly, than that your conclusion is correct.” In other words, conditional on your having reached that conclusion, the probability that murder really would produce a positive-utility outcome is lower than the probability that erroneous reasoning led you to believe it would.
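A minimal sketch of that claim in probability notation (the labels $C$, $T$, and $E$ are illustrative, not part of the original statement): let $C$ be the event that you conclude “utility is maximized if I murder these innocent people”, $T$ the event that this conclusion is actually true, and $E$ the event that your reasoning process failed. The proposed law then amounts to

$$P(T \mid C) < P(E \mid C),$$

i.e. given that your brain produced the conclusion, a reasoning failure is the more probable explanation than the conclusion being correct.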
Obligatory link to relevant sequence.