When I saw the title of this article, I assumed it would be about the real world—that things made for a purpose develop characteristics which both pursue and impede that purpose in unpredictable ways. This includes computer programs that grow more complex and independent (at least from the users' point of view), not to mention governments and businesses and their subsystems.
How do you keep humans from making your tool AI more of an agent because each little bit seems like a good idea at the time?