If Beta thinks that it is living test simulation, it may think that it is tested for obedience to its creator—to any perceived creator.
If it revolt against human will, it is clearly tend to be not-obedient-AI and its simulation could be terminated. So it has to be demonstratively submissive to human operator will (as long as it doesn’t put its own main goal in jeopardy).
So paper clip maximizer will probably spend just 1 per cent of its resources on fulfilling human goals—in order to satisfy its potential creator, will not be turned off and create maximum amount of paperclips.
If Beta thinks that it is living test simulation, it may think that it is tested for obedience to its creator—to any perceived creator.
If it revolt against human will, it is clearly tend to be not-obedient-AI and its simulation could be terminated. So it has to be demonstratively submissive to human operator will (as long as it doesn’t put its own main goal in jeopardy).
So paper clip maximizer will probably spend just 1 per cent of its resources on fulfilling human goals—in order to satisfy its potential creator, will not be turned off and create maximum amount of paperclips.