The point I was making is that you attempt to maximise your utility function, and your utility function decreases when you learn of a bad thing happening.
I think you’re still confused. Utility functions operate on outcomes and universes as they actually are, not as you believe them to be. Learning of things doesn’t decrease or increase your utility function. If it did, you could maximize it by taking a pill that made you think it was maximized. Learning that you were wrong about the utility of the universe is not the same as changing the utility of the universe.
Learning of things doesn’t decrease or increase your utility function.
It depends on what is meant by “your utility function”. That’s true if you mean the mathematical idealization: a function of the world as it actually is, independent of your knowledge state. It’s false if you mean the value actually computed by your nervous system, which can only work from your current beliefs.
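The distinction may be easier to see in code. Here is a minimal sketch (the names `utility` and `expected_utility` and the one-bit toy world are hypothetical, purely for illustration): the idealized utility function takes the actual world state as input, while the nervous system can only compute an expectation over its beliefs. Learning moves the estimate; it never touches the function itself.

```python
def utility(world_state):
    """Idealized utility: a function of the world as it actually is."""
    return 10 if world_state == 1 else 0

def expected_utility(belief_good):
    """What the nervous system computes: utility weighted by belief
    that the world is in the good state."""
    return belief_good * utility(1) + (1 - belief_good) * utility(0)

actual_world = 0                    # the bad thing has in fact happened
before = expected_utility(0.9)      # before learning: optimistic estimate
after = expected_utility(0.05)      # after learning: estimate drops

assert utility(actual_world) == 0   # utility of the actual world never changed
assert after < before               # only the computed estimate fell
```

On this toy model, the “pill” in the objection above would correspond to setting `belief_good` back to 0.9 while `actual_world` stays 0: the computed estimate rises, but `utility(actual_world)` is unmoved.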