Why do you think a piece of software has the same goals as its creator? My conscious planning mind doesn’t have the same goals as evolution.
Current software doesn’t even have goals; it has behaviors. Ascribing desires and decision-making to it leads to incorrect beliefs. Future AIs will have goals, but those goals will be influenced and shaped by their creators rather than fully specified by them.