The comment wasn’t really intended for anyone other than Eliezer, and I forgot to correct for the halo effect making him out to be basically omniscient and capable of reading my mind.
He could still accept superhappies or paperclippers caring about it, say.
I think he actually might intrinsically value their desires too. One can theoretically make the transition from “human” to “paperclip maximizer” one atom at a time; differences in kind are the best way for corrupted/insufficiently powerful software to think about it, but here we’re talking about logical impurity, which would contaminate even at sub-homeopathic doses.
Well, in that case it’s new information, and we can conclude that either his utility function DOES include things in those universes that he claims cannot exist, or it’s not physically possible to construct an agent that would care about them.
I would say “care dependent upon them”. An agent could care dependent upon them without caring about them; the converse is not true.
That’s even wider, although probably by a very small amount. Thanks!