I dunno, I think this is a pretty entertaining instance of anthropomorphizing + generalizing from oneself. At least in the future, I’ll be able to say things like “for example, Goertzel—a genuine AI researcher who has produced stuff—actually thinks that an intelligent AI can’t be designed to have an all-consuming interest in something like pi, despite all the real-world humans who are obsessed with pi!”