Schizophrenia is the wrong metaphor here: it is not the same condition as having split personalities (i.e., dissociative identity disorder). I think it would be clearer and more accurate to rewrite that paragraph without it. I don’t intend this as an attack or harsh criticism; it’s just that I have decided to be a pedant about this point whenever I encounter it, as I think it would be good for the general public to develop a more accurate and realistic understanding of schizophrenia.
Good point. In addition to that, using human diseases as a metaphor for AI misalignment is misleading, because it kinda implies that the default state is health: we only need to find and eliminate the potential causes of imbalance, and health will follow naturally. But the very problem with AI is that there is no such thing as a natural good outcome. A perfectly healthy paperclip maximizer is still a disaster for humanity.