(Policymakers) AI systems are very much unlike humans. AI research isn’t trying to replicate the human brain; the goal is to outperform humans at certain tasks. For the AI industry, better means cheaper, faster, more precise, more reliable. A plane flies faster than birds, and we don’t care that it needs more fuel. Some properties matter (here, speed); some don’t (here, fuel consumption).
When developing current AI systems, we focus on speed and precision, and we pay little attention to unintended outcomes. This isn’t an issue for most systems: a plane’s autopilot doesn’t take actions a human pilot couldn’t, and a human is always there to supervise.
However, this constant supervision is expensive and slow. We’d like our machines to be autonomous and quick. They perform well on the “important” things, so why not give them more power? Except that here we’re creating powerful, faster machines that will reliably do things we didn’t take the time to think about. We built them to be faster than us, so we won’t have time to react to unintended consequences.
This complacency will lead us to unexpected outcomes. The more powerful the system, the worse those outcomes may be.