Hm… Yeah, I think I can run with the notion that we would be able to kinda understand, on some level, anything a superintelligence was trying to convey to us, whereas chimps can't grasp even basic logical arguments (though I'm not sure how much logic some apes are actually able to grasp?). This actually made me think of one area where I could imagine such a difference between humans and AI: our motivational system feels, capability-wise, similar to chimps' language skills (or maybe that's just me?). Knowledge and technology (self-help literature, stimulants, building institutions) give you "some" improvements here, but at the end of the day all your tools won't help if your stupid brain has lost track of why you were doing anything in the first place.