AIs deviate from their intended programming, in ways that are dangerous for humans. And it’s not thousands of years away; it’s as close as a self-driving car crashing into a group of people to avoid a dog crossing the street.
But that’s a very different kind of issue than AI taking over the world and killing or enslaving all humans.
EDIT:
To expand: all technologies introduce safety issues.
Once we got fire, some people got burnt. This doesn’t imply that UFFire (Unfriendly Fire) is the most pressing existential risk for humanity, or that we must devote huge amounts of resources to preventing it and never use fire until we have proved that it will not turn “unfriendly”.
Well, there’s a phenomenon called “flashover”, which occurs in a confined environment when the temperature of a fire becomes so high that every combustible surface within it ignites and feeds the reaction.
Now, imagine that the whole world could become a closed environment for the flashover...
So we should stop using fire until we prove that the world will not burst into flames?
However, UFFire does not uncontrollably and exponentially reproduce or improve its own functioning. Certainly, a conflagration on a planet covered entirely by dry forest would become an unmitigable problem rather quickly.
In fact, in such a scenario, we should dedicate a huge amount of resources to preventing it, and never use fire until we have proved it will not turn “unfriendly”.
Do you realize this is a totally hypothetical scenario?