Thanks! A big part of my motivation for writing this was to try to direct more attention to Max's excellent, detailed, and important work on personal intent alignment. And I wanted to understand the group communication/epistemics that have kept it from being as impactful as I think it deserves to be.
People who think seriously about AGI alignment still seem to mostly assume that we'll try to value-align AGI. They shouldn't: that's nigh impossible in the near term.