Good point. In addition to that, using human diseases as a metaphor for AI misalignment is misleading, because it implies that the default state is health; we only need to find and eliminate the potential causes of imbalance, and health will follow naturally. But the very problem with AI is that there is no such thing as a naturally good outcome. A perfectly healthy paperclip maximizer is still a disaster for humanity.