It seems unlikely that we’d label something GAI unless it possesses (or rather appears to possess) its own “theory of mind” for the things with which it communicates. At that point I’d expect collaboration to arise in much the same ways that it does for humans, with many of the same warts. That presupposes we get something relatively equal in capability rather than something dog-like. If we get something god-like instead we may learn more about the dog’s point of view rather quickly.
I don’t think humans collaborate by default—it happens only because our evolution was shaped by social pressure, and it occurs primarily at the level of social structures rather than as an outcome of individual effort.
Even if this is wrong, however, non-GAI systems can still pose existential risks.