The flaw I’d point out is that Clippy’s utility function is “utility = number of paperclips in the world”, not “utility = number of paperclips Clippy thinks are in the world”.
Learning about the creation or destruction of paperclips does not actually increase or decrease the number of paperclips in the world.
I agree. That’s the confusion that I was sorting out for myself.
The point I was making is that you attempt to maximise your utility function. Your utility function decreases when you learn of a bad thing happening. My point was that if you don’t know, your utility function is the weighted average of the various outcomes you think could happen, so on average, learning the state of the universe does not change your utility in that way.
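To spell out the “weighted average” claim in symbols (my notation, not the original commenter’s): expected utility is a probability-weighted sum over possible worlds, and by the law of total expectation, the average effect of learning some evidence X is zero:

```latex
\mathbb{E}[U] = \sum_{w} P(w)\,U(w),
\qquad
\mathbb{E}_{X}\!\left[\mathbb{E}[U \mid X]\right] = \mathbb{E}[U].
```

Any particular observation can raise or lower the estimate, but before you look, the expected change is zero.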
The point I was making is that you attempt to maximise your utility function. Your utility function decreases when you learn of a bad thing happening
I think you’re still confused. Utility functions operate on outcomes and universes as they actually are, not as you believe them to be. Learning of things doesn’t decrease or increase your utility function. If it did, you could maximize it by taking a pill that made you think it was maximized. Learning that you were wrong about the utility of the universe is not the same as changing the utility of the universe.
Learning of things doesn’t decrease or increase your utility function.
It depends on what is meant by “your utility function”. That’s true if you mean a mathematical idealization of a calculation performed in your nervous system that ignores your knowledge state, but false if you mean the actual result calculated by your nervous system.
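A minimal runnable sketch of the distinction the thread is circling (the names `utility` and `expected_utility` and the numbers are mine, not from the discussion): the utility function is evaluated on the world as it actually is, while the agent can only compute an expectation over its beliefs, and a belief-altering pill moves only the latter.

```python
def utility(world):
    """Clippy's utility: the number of paperclips actually in the world."""
    return world["paperclips"]

def expected_utility(beliefs):
    """What Clippy can compute: a probability-weighted average over possible worlds."""
    return sum(p * utility(w) for w, p in beliefs)

actual_world = {"paperclips": 10}

# Clippy is unsure whether a paperclip was destroyed.
beliefs = [({"paperclips": 10}, 0.5), ({"paperclips": 9}, 0.5)]
print(expected_utility(beliefs))  # 9.5 -- the agent's estimate
print(utility(actual_world))      # 10  -- the actual utility

# A pill that convinces Clippy there are a million paperclips changes
# only the estimate, not the utility of the actual world.
deluded_beliefs = [({"paperclips": 1_000_000}, 1.0)]
print(expected_utility(deluded_beliefs))  # 1000000 -- belief, not reality
print(utility(actual_world))              # 10      -- unchanged
```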