Somehow the word “predicted” in the title (as opposed to, say, “future” or “planned”) led me to expect entries for things like “OpenAI releases explicit model of human utility function” and “Entire mass of planet earth converted to paperclips”...
If both of those things happened I would be very interested in hearing about the person who decided to make a paperclip maximizer despite having an explicit model of the human utility function they could implement.
Actually, I wouldn’t be interested in anything. I would be paperclips.
It hardly seems to make sense to implement a utility function for a paperclip plant; your AI would be focused on solving death and making people happy instead of making more paperclips!
Thanks for pointing that out. Do you have a suggestion for a less misleading title?
Timeline of AI Alignment meetings
Dunno. I don’t think the way it is does any actual harm. Maybe something with “meetings” in it, as per Teerth Aloke’s suggestion.
Just ‘meeting’ sounds too unimportant. But I’ve added it to the title, which removes the ambiguity.