I will grant any request that doesn’t (1)… (2)… (3)…
It’s better to grant only those requests that should be granted. And since some things that should be granted are never asked for, “explicit requests” is also the wrong category to consider: the AI just does what it should, requests or no requests. There seems to be no reason even to assume there should be “sentient life”, as opposed to more complicated and more valuable stuff that doesn’t factorize into individuals.
Any god will either quickly kill you or be friendly.
The concepts of “not killing” and “friendliness” are distinct, hence there are Not Killing AIs that are not Friendly, and Friendly AIs that kill (when killing is the better alternative to not killing).
It goes downhill from “What happens now?”.
Does this count?