Those seem like really important distinctions. I have the feeling that people who don't think AI alignment is super important are, either implicitly or explicitly, thinking only of parochial alignment rather than holistic alignment, and simply don't consider the former to be all that difficult.
People who don't know much about AI have two mental templates, neither of which fits AI: the first is a conventional computer, the second is a human being. Following the first template, it seems obvious that AGIs will be obedient and slavish.