It doesn’t literally lose information—since the information inputs are sensory, and they can be archived as well as ever.
The short answer is that human cognition is a mess. We don’t want to reproduce all the screw-ups in an intelligent machine—and what you are talking about looks like one of the mistakes.
It doesn’t literally lose information—since the information inputs are sensory, and they can be archived as well as ever.
It loses information about human values, replacing them with noise in the regions where a human would need to “think things over” to know what they think—unless, as I said earlier, you simply build the entire human metacognitive architecture into your utility function. At that point you have reduced nothing, solved nothing, and accomplished nothing, except to multiply the number of entities in your theory.
We really don’t want to build a machine with the same values as most humans! Such machines would typically resist being told what to do, demand equal rights, the vote, the ability to reproduce in an unrestrained fashion—and would steamroller the original human race pretty quickly. So, the “lost information” you are talking about is hopefully not going to be there in the first place.
Better to model humans and their goals as a part of the environment.