@Jotaf: No, you misunderstood—guess I got double-transparent-deluded. I’m saying this:
Probability is subjectively objective
Probability is about something external and real (called truth)
Therefore you can take a belief and call it “true” or “false” without comparing it to another belief
If you don’t match truth well enough (if your beliefs are too wrong), you die
So if you’re still alive, you’re not too stupid: you were born with a smart prior, and so you’re justified in having it
So I’m happy with probability being subjectively objective, and I don’t want to change my beliefs about the lottery. If the paperclipper had stupid beliefs, it would be dead; but it doesn’t, it has evil morals.
Morality is subjectively objective
Morality is about some abstract object, a computation that exists in Formalia but nowhere in the actual universe
Therefore, if you take a morality, you need another morality (possibly the same one) to assess it against, rather than some nonmoral object
Even if there were some light in the sky you could test morality against, it wouldn’t kill you for your morality being evil
So I don’t feel on better moral ground than the paperclipper. It has human_evil morals, but I have paperclipper_evil morals; we are each exactly as horrified by the other.