Agreed completely. This goes for any utilitarianism where the worth of changing from state A to state B is f(B) - f(A). Morality is about transitions; even hedonism is, as happiness is nothing if it is frozen solid.
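For concreteness, here is a minimal sketch of that difference-based scoring; the transition_worth helper and the toy happiness function are my own, purely illustrative.

```python
# A minimal sketch of the difference-based scoring described above; the helper
# and the toy welfare function are mine, purely for illustration.

def transition_worth(f, state_a, state_b):
    """Worth of the transition A -> B when only the difference in f counts."""
    return f(state_b) - f(state_a)

def happiness(state):
    """Toy welfare function: happiness is just a number attached to the state."""
    return state["happiness"]

sad, happy = {"happiness": 0.0}, {"happiness": 10.0}

print(transition_worth(happiness, sad, happy))    # 10.0: becoming happy counts
print(transition_worth(happiness, happy, happy))  # 0.0: staying happy counts for nothing
```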
I’d take A and B in the equation above to include momenta as well as positions? :-)
That’s a good escape, but only for specific laws of physics… what do you do about a brain sim on a computer? It has multiple CPUs computing the next state from the current state in parallel; it doesn’t care how the CPUs are physically implemented, but it does care how many experience-steps it gets. edit: i.e. what I mean is, the transition from one happy state to another, equally happy state is what a moment of being happy is. So total utilitarianism assigns zero utility to an update pass on a happy brain sim: f(B) - f(A) = 0 whenever the two states are equally happy. It’s completely broken. edit: and with simple workarounds it assigns zero utility to swapping the current/next state arrays, so you might as well sit in a loop recalculating the same next state from a static current state.
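A minimal sketch of the brain-sim setup being described, assuming a double-buffered state array; every name in it (update_pass, update_chunk, happiness, the averaging update rule) is a hypothetical stand-in, not anyone's actual simulator. Several workers fill the next-state array from the read-only current-state array, and under f(B) - f(A) a pass over an equally happy brain scores zero whether or not the buffers are ever swapped.

```python
# A minimal sketch (names and update rule are hypothetical) of a double-buffered
# brain sim: workers compute the next state from the current state in parallel,
# then the current/next arrays are swapped for the following step.

from concurrent.futures import ThreadPoolExecutor

def update_chunk(current, nxt, lo, hi):
    """Fill one slice of the next state from the (read-only) current state."""
    for i in range(lo, hi):
        # Stand-in for whatever local update rule the sim actually uses.
        nxt[i] = 0.5 * (current[i] + current[(i + 1) % len(current)])

def update_pass(current, nxt, n_workers=4):
    """One experience-step: compute `nxt` from `current` across several workers."""
    n = len(current)
    chunk = (n + n_workers - 1) // n_workers
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        for w in range(n_workers):
            pool.submit(update_chunk, current, nxt, w * chunk, min((w + 1) * chunk, n))

def happiness(state):
    """Toy welfare function; constant for a brain that stays equally happy."""
    return 10.0

current, nxt = [1.0] * 1024, [0.0] * 1024

# Honest version: compute the next state, score the transition, swap the buffers.
update_pass(current, nxt)
print(happiness(nxt) - happiness(current))  # 0.0, so the pass "earns" nothing
current, nxt = nxt, current

# Degenerate "workaround": never swap, so the loop keeps recomputing the same
# next state from a static current state. Same zero score, and no experience.
for _ in range(3):
    update_pass(current, nxt)
```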