For almost any objective an AI had, the more free energy it controlled, the better it could accomplish it. The AI would likely go after the free energy being dissipated by both stars and people. The AI couldn’t afford to wait to kill people until after it had dealt with nearby stars, because by then humans would likely have created another AI god.
Assuming that by “AI” you mean something that maximizes a utility function, as opposed to a dumb apocalypse like a grey-goo or energy virus scenario.
I can see how a “dumb apocalypse like a grey-goo or energy virus” would be Artificial, but why would you call it Intelligent?
On this site, unless otherwise specified, AI usually means “at least as smart as a very smart human”.
Yeah, that makes sense. I was going to suggest “smart enough to kill us”, but that’s a pretty low bar.