Luke, while I agree with the premise, I think that the bogeyman of machines taking over may be either inevitable or impossible, depending on where you put your assumptions.
In many ways, machines have already BEEN smarter and stronger than humans. Machine AI may make individual machines or groups of them formidable, but until they can reason, replicate, and trust or deceive, I’m not sure they have much of a chance.
Computers can deceive, they just need to be programmed to (which is not hard).
(I remember reading an article a while ago about computers strategically lying (or something similar), but unfortunately I can’t find it again.)
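For what it’s worth, “programmed to deceive” can be as simple as a hard-coded bluffing rule. A minimal sketch (a hypothetical poker-style bot, not from that article; the threshold and bluff rate are made up):

```python
import random

def choose_action(hand_strength: float, bluff_rate: float = 0.2) -> str:
    """Bet honestly on strong hands; occasionally bet on weak ones (a bluff)."""
    if hand_strength > 0.7:
        return "bet"   # honest signal: the hand really is strong
    if random.random() < bluff_rate:
        return "bet"   # deceptive signal: act strong while holding a weak hand
    return "fold"

print(choose_action(0.3))  # sometimes "bet" purely to mislead the opponent
```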
(Although, it’s very possible that a computer with sufficient reasoning power would just exhibit “trust” and “deception” (and self-replicate), because they enable it to achieve its goals more efficiently.)
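To make that parenthetical concrete: if a planner simply maximizes expected payoff over actions that happen to include a misleading report, deception falls out of the arithmetic without any dedicated “lie” module. A toy sketch with invented numbers:

```python
# Toy negotiation: the agent can report its valuation truthfully or overstate it.
# The probabilities and payoffs below are invented for illustration only; the
# point is that a plain argmax over expected value picks the deceptive report.
actions = {
    "report_truthfully": {"p_deal": 0.9, "payoff_if_deal": 5.0},
    "overstate_value":   {"p_deal": 0.7, "payoff_if_deal": 9.0},
}

def expected_payoff(a: dict) -> float:
    return a["p_deal"] * a["payoff_if_deal"]

best = max(actions, key=lambda name: expected_payoff(actions[name]))
print(best)  # -> overstate_value (6.3 vs 4.5): deception chosen because it pays more
```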
“…until they can reason, replicate, and trust or deceive, I’m not sure they have much of a chance.”
Trust is one of the top four strengths they’re missing?
What does “to reason” mean?
Getting there (on replication).
Again, define “to trust”.