That doesn’t answer my question. Again, what rationalist/AI mistake would I avoid as a result of reading GEB that I could not avoid by reading something shorter?
As I said, there is not necessarily any kind of rationalist/AI content in GEB that is directly relevant to us. It could well just be a good book.
But would Eliezer view it as that durn good (i.e., a tragedy that people die without reading it) if it were just entertaining fluff with no insights into AI and rationality?
I’m not Eliezer, and perhaps not being an AGI researcher means that my answer is irrelevant, but I think that things can have a deep aesthetic value or meaning from which one could gain insights into things more important than AI or rationality. One of these things may be the ‘something to protect’ that Eliezer wrote about. Others may be intrinsic values to discover, to give your rationality purpose. If I could keep only one book, a copy of the Gospels of Buddha or a copy of MITECS, I would keep the Gospels of Buddha, because it reminds me of the importance of terminal values like compassion. When I read GEB, its ideas of interconnectedness, of patterns, and of meaning left me with a clearer thought process than did Eliezer’s short paper on Coherent Extrapolated Volition, which was enjoyable but just didn’t resonate in the same way. Calling these things ‘entertaining fluff’ may be losing sight of Eliezer’s 11th virtue: “The Art must have a purpose other than itself, or it collapses into infinite recursion.”
That is all, of course, just my humble opinion. Maybe having everyone read about and understand the dangers of black swans and unfriendly AI would be more productive than having them read about and understand the values of compassion and altruism; for if people do not understand the former, there may be no world left for the latter.