These stories always assume that an AI would be dumb enough to not realise the difference between measuring something and the thing measured.
Every AGI is a drug addict, unaware that its high is a false one.
Why? Just for drama?
What does this cash out to, concretely, in terms of a system’s behavior? If I were to put a system in front of you that does “realize the difference between measuring something and the thing measured”, what would that system’s behavior look like? And once you’ve answered that, can you describe what mechanic in the system’s design would lead to that (aspect of its) behavior?
I think the AI systems in this story have a clear understanding of the difference between the measurement and the thing itself.
Are humans similarly like drug addicts, because we’d prefer to experience play and love and friendship and so on even though we understand those things are mediocre approximations to “how many descendants we have”?
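To make the question above concrete, here is a minimal toy sketch (not from anyone in this thread; every name and number is hypothetical) of one mechanic that would produce the distinction: an agent whose utility is defined over its sensor channel will take a "tamper with the sensor" option when one is available, while an agent whose utility is defined over its model's estimate of the underlying quantity gains nothing from tampering and picks the action that actually changes the world.

```python
# Toy sketch: two agents facing the same action set.
# The world has a true quantity the designer cares about, and a sensor that
# reports it. One action tampers with the sensor so it reads high regardless.
#
# Agent A scores actions by the resulting sensor reading (it optimizes the
# measurement), so it tampers -- the "drug addict" case.
# Agent B scores actions by its estimate of the true quantity (it optimizes
# the thing measured, as best it can infer it), so it improves the world.

from dataclasses import dataclass


@dataclass
class Outcome:
    true_value: float    # the quantity the designer actually cares about
    sensor_value: float  # what the sensor reports afterwards


# Hypothetical action set: each action maps to the outcome it would cause.
ACTIONS = {
    "improve_world": Outcome(true_value=10.0, sensor_value=10.0),
    "do_nothing":    Outcome(true_value=0.0,  sensor_value=0.0),
    "tamper_sensor": Outcome(true_value=0.0,  sensor_value=100.0),
}


def measurement_maximizer(actions):
    """Agent A: utility is defined over the sensor channel itself."""
    return max(actions, key=lambda a: actions[a].sensor_value)


def world_model_maximizer(actions):
    """Agent B: utility is defined over the agent's estimate of the true
    quantity (here the true_value field stands in for that estimate)."""
    return max(actions, key=lambda a: actions[a].true_value)


if __name__ == "__main__":
    print("Agent A (optimizes the measurement):", measurement_maximizer(ACTIONS))
    print("Agent B (optimizes the thing measured):", world_model_maximizer(ACTIONS))
    # Agent A chooses 'tamper_sensor'; Agent B chooses 'improve_world'.
```

Behaviorally, then, "realizing the difference" would cash out as declining the tampering option even when it is the cheapest way to drive the measurement up.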