The utilitarian calculus is the idea that what people value can be described simply in terms of summation, ha. Complexity is another kind of f(a,b,c,d) that behaves vaguely like a 'sum', but is not as hopelessly simple (and stupid) as summation. If a,b,c,d are strings in a programming language, the expression is often written f(a+b+c+d), yet it is something fundamentally different from summation of real-valued numbers.
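To make the distinction concrete, here is a minimal sketch (the variable names are illustrative, not from anything above): in a language like Python the same '+' sign denotes numeric addition on numbers and concatenation on strings, and the two operations behave very differently.

```python
# '+' on numbers is summation: order-insensitive, information-destroying.
a, b, c = 1, 2, 3
print(a + b + c)  # 6

# '+' on strings is concatenation: order-sensitive, information-preserving.
a, b, c = "1", "2", "3"
print(a + b + c)  # 123
```

Summation collapses its inputs into one number; concatenation keeps everything, so a metric applied afterwards can still see the structure of the parts.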
Go downvote everything on summation of utilities, please, because it is much simpler than what I propose. It seems to me that we also vaguely describe our complexity-like metric of A and B as a 'sum' of A and B.
The trouble is not the simplicity (appropriately enough). The trouble is that complexity is not, not even a little bit, a general basis for what humans value.
Let’s just go around making assertions and plus-ing the assertions that agree with EY’s assertions.
How is complexity of concatenated strings an inherently worse model than a sum of real numbers? Where does it fail? It seems to me it describes what I think is right better than summation does. E.g. I don't think that if we make a duplicate of you and synchronize the duplicated brains every 20 seconds, we should give you both twice the candy; nor, if there are 10 such duplicates, should we cut anyone up for transplants into them.
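A rough illustration of the duplicate-brains point, using compressed length as a crude, computable stand-in for a complexity metric (the byte string and helper here are invented for the example):

```python
import zlib

def complexity(s: bytes) -> int:
    # Compressed length: a crude proxy for the information content of s.
    return len(zlib.compress(s, 9))

person = b"the full mental state of one person, arbitrarily detailed " * 200
duplicated = person + person  # an exact, synchronized copy

# Plain summation says two copies count double; the complexity-style
# metric barely grows, since the copy adds almost no new information.
print(len(duplicated) / len(person))                # 2.0
print(complexity(duplicated) / complexity(person))  # well under 2.0
```

Under summation the duplicate doubles the total; under a complexity-style metric it adds nearly nothing, which matches the intuition that a synchronized copy is not a second independent person.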
EY's post that you linked should be applied to every other notion of morality, including utilitarian summing, which is a strictly dumber approach than concatenation followed by a complexity metric. edit: that's because there is a complexity metric that just looks at the length of the string, the dumbest one possible, in which case it is identical to summation.
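The degenerate case mentioned in the edit can be checked directly: if the 'complexity metric' is just string length, concatenation followed by the metric collapses back into ordinary summation.

```python
# Length of a concatenation equals the sum of the lengths,
# so length-as-complexity is exactly the summation model.
a, b = "aaaa", "bbb"
assert len(a + b) == len(a) + len(b)  # 7 == 4 + 3
```

So summation is the special case you get from the least informative complexity metric, which is the sense in which it is the strictly dumber approach.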
edit: also I'm becoming convinced that EY would make an utterly terrible engineer or scientist. Engineering and science work by assuming (wishful thinking, of course) that there is a simple process X that coincides with the complex process Y well enough, because all you can infer from the available data, given the noise and the limited number of observations, is a simple process X. The alternative is to sit twiddling your thumbs doing nothing: if there is no simple approximation, you won't be able to figure out the complex process, or build anything. Engineering requires ordering the search for solutions by probability of success times inverse difficulty, and this favours simple hypotheses. 'There is no unifying principle' may seem like the best guess when you are just guessing; when you are trying to build something, it is the very worst guess.