If you make a robot that explicitly cares about things differently depending on how heavy it is, then sure, it can take actions as if it cared about things more when it’s heavier.
But that is done using the same probabilities as normal, merely a different utility function. Changing your utilities without changing your probabilities has no impact on the “probability of being a mind.”
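To make that concrete, here is a minimal Python sketch. The numbers, the pay-for-an-experience setup, and the mass-times-value utility are all made up for illustration; the point is only that the two robots share the exact same probabilities, and the mass dependence lives entirely in the utility function.

```python
# Beliefs: probability of getting the experience, conditional on each action.
# These are identical for every robot, regardless of its mass.
P_EXPERIENCE = {"pay": 0.9, "dont_pay": 0.0}

# Flat cost of each action (made-up numbers).
COST = {"pay": 20.0, "dont_pay": 0.0}

def expected_utility(action, mass):
    # Hypothetical mass-dependent utility: the experience is worth mass * 5,
    # the cost is flat. Only the utility consults the robot's mass; the
    # probabilities above are never touched.
    p = P_EXPERIENCE[action]
    return p * (mass * 5.0) - COST[action]

def choose(mass):
    # Ordinary expected-utility maximization over the two actions.
    return max(P_EXPERIENCE, key=lambda a: expected_utility(a, mass))

print(choose(mass=1.0))   # 'dont_pay': 0.9 * 5  - 20 < 0
print(choose(mass=10.0))  # 'pay':      0.9 * 50 - 20 > 0
```

The heavier robot behaves as if it "counts for more," but nothing in its beliefs changed; only the stakes did.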
We don’t have to program the machine to explicitly care about things differently depending on how heavy they are. Instead, we program the machine to care simply about how many systems exist—but wait! It turns out we don’t know what we mean by that, according to yttrium.