All good points. I was mostly thinking about an evolved paperclip maximizer, which may or may not be a result of a fooming paperclip-maximizing AI.
Evolved creatures as we know them (at least those with complex brains) are maximizers of their own reward-center signal, which only implicitly correlates with being offspring maximizers. (Simpler, non-brainy organisms are probably closer to direct offspring maximizers.)
An evolved agent wouldn’t evolve to maximize paperclips.
It could if the environment rewarded paperclip production. Admittedly this would require an artificial environment, but that’s hardly impossible.
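To make the point concrete, here is a minimal toy sketch (hypothetical, not from the thread) of such an artificial environment: a simple evolutionary loop where reproductive fitness is just paperclips produced. Under that selection pressure, the population drifts toward spending all its effort on paperclips. All names and parameters here are made up for illustration.

```python
import random

# Each "agent" is a single number in [0, 1]: the fraction of its effort
# spent making paperclips rather than doing anything else.
POP_SIZE = 100
GENERATIONS = 50
MUTATION_SD = 0.05  # standard deviation of mutation noise


def paperclips_made(effort: float) -> float:
    """Fitness: the (artificial) environment pays off in paperclips only."""
    return effort  # more paperclip effort -> more offspring


def evolve() -> float:
    population = [random.random() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Select parents in proportion to paperclips made,
        # i.e. paperclip output *is* offspring count here.
        parents = random.choices(
            population,
            weights=[paperclips_made(a) for a in population],
            k=POP_SIZE,
        )
        # Reproduce with small mutations, clamped to [0, 1].
        population = [
            min(1.0, max(0.0, p + random.gauss(0.0, MUTATION_SD)))
            for p in parents
        ]
    return sum(population) / POP_SIZE


if __name__ == "__main__":
    print(f"mean paperclip effort after evolution: {evolve():.3f}")
```

Running this, the mean effort climbs toward 1.0: the evolved agents end up as paperclip maximizers simply because that is what the environment rewarded, which is the claim above in miniature.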