If both of those things happened, I would be very interested in hearing about the person who decided to make a paperclip maximizer despite having an explicit model of the human utility function they could have implemented instead.
Actually, I wouldn’t be interested in anything. I would be paperclips.
It hardly seems to make sense to implement a human utility function in a paperclip plant; your AI would be focused on solving death and making people happy instead of making more paperclips!