Environment means both hardware and people—anything the AI has a chance to influence. We could use a narrower definition, but why should the AI respect it? By limiting our map we don’t limit the territory.
When the AI gets much smarter than humans, we may not understand the output of our inspection tools. They will give us huge amounts of data, and we will be unable to decipher what it all means.
Imagine a group of monkeys trying to enslave a human in a cave. The monkeys bring objects from the jungle and make the human produce better food and toys for them (we want the AI to do some real-life optimization; otherwise it's just money wasted on academic exercises). The monkeys understand that a human moving toward the entrance is trying to escape, and they will threaten to kill him if he tries. But they don't see the danger of the human quietly sitting at the back of the cave, building a machine gun from the spare parts.