Human values differ a lot depending on the surroundings. If you dropped all humans into fairy-tale land, they might react very differently than they do now.
You seem to assume that the goal structure of a human is stable. But when I look around I see all kinds of manipulations happening: religion and politics being one, advertisement being another.
An AI doesn’t have to rewire brains the hard way. It could just buy a major entertainment company, write the values it prefers humans to have into a soap opera, and then buy some ads at the Super Bowl.
Allowing the AI to change human values in this indirect way opens a big can of worms.
Not allowing that does too.
Ads for using seat-belts anyone?
The more I learn about humans, the more I doubt there is an inherent internal goal structure (besides the more general ones, like not killing humans off). Manipulations are way too easy to do.