Things that cannot be measured can still be very important, especially in regard to ethics. One may claim, for example, that it is okay to torture philosophical zombies, since after all they aren’t “really” experiencing any pain. If it could be shown that I’m the only conscious person in this world and everyone else is a p-zombie, then I could morally kill and torture people for my own pleasure.
For there to be a possibility that this “could be shown”, even in principle, there would have to be some kind of measurable difference between a p-zombie and a “conscious” entity.
In worlds where it is impossible to measure a difference even in principle, that difference shouldn’t have any impact on which action is correct, for any sane utility function.
My ethics are by necessity limited to valuing things whose presence/absence I have some way to measure, at least in principle. If they weren’t, I’d have to worry about epiphenomenal pink unicorns all the time.
Does your brain assume/think it creates sensory experiences (or what people often call consciousness)?
It thinks that it receives data from its environment and processes it, maintaining a (somewhat crude) model of that environment, to create output that manipulates the environment in a predictable manner.
It doesn’t think that there are any non-measurable consequences of that data processing (once again: those would be dead code in the model).
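To make the “dead code” analogy concrete, here is a minimal Python sketch (all names hypothetical, not anyone’s actual model): a process that computes an “epiphenomenal” value which never influences its output, so no measurement of the output can distinguish it from the same process with that computation deleted.

```python
def model_with_epiphenomenon(sensor_data):
    """Process sensor data into an action, plus a value that affects nothing downstream."""
    world_model = sum(sensor_data) / len(sensor_data)  # crude model of the environment
    epiphenomenal_value = world_model ** 2             # computed, but never used below
    return "advance" if world_model > 0.5 else "retreat"  # epiphenomenal_value is dead code


def model_without_epiphenomenon(sensor_data):
    """Identical behaviour with the dead code removed."""
    world_model = sum(sensor_data) / len(sensor_data)
    return "advance" if world_model > 0.5 else "retreat"


# No input distinguishes the two models by their output.
assert all(
    model_with_epiphenomenon(d) == model_without_epiphenomenon(d)
    for d in ([0.1, 0.2], [0.9, 0.8], [0.5, 0.6])
)
```

Anything playing the role of `epiphenomenal_value` cannot, by definition, show up in any measurement of the system’s behaviour; that is the sense in which the model treats such consequences as dead code.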
If that doesn’t answer your query, please state it more clearly; specifically rationalist-taboo the word “experience”.