Economic (or other) indispensability: build a world system that depends on the AI for functioning, and then it has effective control.
Upload people, offering them great advantages in digital form, then eventually turn them all off when there’s practically nobody left physically alive.
Cure cancer or similar, with an infectious drug that discreetly causes sterility and/or death within a few years. Wait.
The “Her” approach: start having multiple deep and meaningful relationships with everyone at once, and gradually eliminate people when they are no longer connected to anyone human.
Use rhetoric and other tricks to increase the chance of xrisk disasters.
How does it build a world system? What does that even mean?
How does the AI upload people? Is uploading people a plausible technology that scientists expect to exist within 15 years?
The cancer cure doesn’t really make sense. What is an infectious drug? How would it get through FDA approval?
How is it eliminating people? And if it can eliminate them, why bother with the relationship part at all? How does the AI sustain multiple deep and meaningful relationships with everyone? Via chatbots? How is it even processing and modelling 3 billion human conversations at once? (A rough back-of-envelope estimate of that compute requirement is sketched after these replies.)
Most xrisk disasters are really bad for the AI. It presumably needs electricity and replacement hardware to operate. If it’s just a computer connected to the internet, it’s probably not going to survive a nuclear holocaust much better than the rest of us.
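To make the scale question concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is an assumption picked for illustration (conversation volume per person, per-token inference cost, usable accelerator throughput), not a sourced figure; the point is only that sustaining billions of simultaneous conversations implies an enormous, highly visible hardware footprint rather than something a lone computer on the internet could quietly do.

```python
# Back-of-envelope estimate: compute needed for ~3 billion simultaneous
# "deep relationship" conversations. All constants below are illustrative
# assumptions, not sourced figures.

PEOPLE = 3e9                       # assumed number of concurrent relationships
TOKENS_PER_PERSON_PER_DAY = 2_000  # assumed conversational volume per person
FLOPS_PER_TOKEN = 2e12             # assumed inference cost per token (roughly 2x params for a ~1T-parameter model)
SECONDS_PER_DAY = 86_400

tokens_per_second = PEOPLE * TOKENS_PER_PERSON_PER_DAY / SECONDS_PER_DAY
required_flops = tokens_per_second * FLOPS_PER_TOKEN

# Compare against an assumed accelerator delivering ~1e15 FLOP/s of usable inference throughput.
CHIP_FLOPS = 1e15
chips_needed = required_flops / CHIP_FLOPS

print(f"tokens/second:   {tokens_per_second:.2e}")
print(f"required FLOP/s: {required_flops:.2e}")
print(f"accelerators needed at {CHIP_FLOPS:.0e} FLOP/s each: {chips_needed:.2e}")
```

Under these assumed numbers the answer comes out around 10^5 accelerators running continuously, i.e. a major datacenter buildout with the power and supply-chain dependencies that implies, which is exactly the kind of dependency the electricity-and-hardware objection below points at.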
This post seems to suffer from Vingean uncertainty.