I think it depends on who you can reasonably expect to get control of such a system.
We didn’t assume that building a nuclear arsenal meant it would be used, because only one state actor was likely to have the option of using it.
If it’s likely that an AI/AGI will be stolen and copied many times, then we should assume that anything it can do, it will be told to do.
If we assume there’s only a decent chance that one other state actor will get its hands on it, then we might assume the worst capabilities are unlikely to be used.
If it’s three or more state actors, it depends on which states and exactly how well the diplomacy goes...