There seems to be controversy about “exist” and “out there”; can you taboo those?
For example, are you saying that you think the Ultimate Ensemble does not contain structures that depend on them, or that they lead to an inconsistency somewhere, or simply that your utility function does not speak about things that require them, or what exactly?
This isn’t a much simpler/weaker claim than the other possible meanings of “I just have trouble believing”.
Their presence in other utility functions would be contagious. For example, if my utility function requires them, then someone of whom it is accurate to say “his utility function does not speak about things that require them” wouldn’t be able to include in his utility function my desires, or the desires of those who cared about my desires, or the desires of people who cared about the desires of people who cared about my desires, and so forth.
Eliezer cares about some people, some people care about me, and the rest is six degrees of Kevin Bacon.
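To make that contagion concrete, here is a minimal sketch in Python (the names and the “cares about” graph are entirely hypothetical): caring propagates through the transitive closure of the graph, so a requirement in my utility function is inherited by everyone who can reach me through a chain of caring.

    # Hypothetical "cares about" graph: an edge A -> B means A's utility
    # function includes B's desires. All names are made up.
    cares_about = {
        "Eliezer": ["Alice"],
        "Alice": ["Bob"],
        "Bob": ["me"],
        "me": [],
    }

    def reachable(graph, start):
        """Everyone whose desires `start` inherits, directly or through a
        chain of people caring about people (the transitive closure)."""
        seen, stack = set(), [start]
        while stack:
            for other in graph.get(stack.pop(), []):
                if other not in seen:
                    seen.add(other)
                    stack.append(other)
        return seen

    # If my utility function requires the disputed structures, anyone who
    # can reach "me" through the graph inherits that requirement.
    print(reachable(cares_about, "Eliezer"))  # {'Alice', 'Bob', 'me'}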
The most extreme interpretation along those lines would have to be a statement about human utility functions in general.
I don’t know what it means to care about the existence of the smallest uncountable ordinal (as opposed to caring that this existence can be proved in ZF, or cannot be refuted in second-order arithmetic, or something like that). Can we taboo “smallest uncountable ordinal” here?
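For concreteness, here is the standard textbook definition being gestured at (my addition, not anything from the thread), in LaTeX:

    % omega_1 is the least ordinal that admits no injection into omega.
    % Its existence is a theorem of ZF (via Hartogs' construction), which
    % is a different claim from omega_1 "existing" in some further sense.
    \[
      \omega_1 \;=\; \min \{\, \alpha \in \mathrm{Ord} : \neg\, \exists f \, (f \colon \alpha \hookrightarrow \omega) \,\}
    \]

Tabooing the phrase then amounts to asking whether the caring attaches to anything beyond what ZF (or second-order arithmetic) proves about this definition.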
In real life, I’ve recently had some trouble admitting I hadn’t thought of something when it was plausible to claim I had. I think that admitting it would/will cost me status points, since the situation does not involve rationalists, “rationalists”, aspiring rationalists, or “aspiring rationalists”.
Are you sure you chose the phrase “simply that your utility function does not speak about things that require them” to describe a state of affairs where no human utility function speaks of such things, and hence they would be unimportant to Eliezer?
If you see the thought expressed in my comment as trivially obvious, then:
1) we disagree about what people would find obvious,
2) regardless of what people actually find obvious, you are probably smarter than I am to have made that assumption, rather than simply less good at modeling other humans’ understanding, and
3) I’m glad to be told by someone smarter than I am that my thoughts are trivial, rather than wrong.
The comment wasn’t really intended for anyone other than Eliezer, and I forgot to correct for the halo effect making him out to be basically omniscient and capable of reading my mind.
Well, yeah, presumably it implies he believes all humans have that trait, but he could still accept superhappies or paperclippers caring about it, say.
I think he actually might intrinsically value their desires too. One can theoretically make the transition from “human” to “paperclip maximizer” one atom at a time; differences in kind are the best way for corrupted/insufficiently powerful software to think about it, but here we’re talking about logical impurity, which would contaminate even at sub-homeopathic doses.
Well, in that case it’s new information, and we can conclude that either his utility function DOES include things in those universes that he claims cannot exist, or it’s not physically possible to construct an agent that would care about them.
I would say “care dependent upon them”. An agent could care dependent upon them without caring about them; the converse is not true.
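A toy sketch of the distinction, with both utility functions hypothetical: an agent can value x terminally, or its behavior can merely depend on x because x changes how much of what it does value each plan yields.

    # Terminal caring: x itself contributes to utility.
    def u_terminal(x_exists, paperclips):
        return paperclips + (10 if x_exists else 0)

    # Dependent caring: utility counts only paperclips, but which plan is
    # best depends on whether x holds, because x affects the yield.
    def clip_yield(plan, x_exists):
        return {"plan_a": 5, "plan_b": 3}[plan] + (4 if plan == "plan_b" and x_exists else 0)

    def u_dependent(plan, x_exists):
        return clip_yield(plan, x_exists)

    # The dependent agent's choice flips with x without valuing x itself:
    best = lambda x: max(["plan_a", "plan_b"], key=lambda p: u_dependent(p, x))
    print(best(False), best(True))  # plan_a plan_b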
That’s even wider, although probably only by a very small amount. Thanks!