Disclaimer: I have only skimmed the post, so the following may already have a counterargument.
I think humans are fanatical in the relevant sense. As Yudkowsky says, he personally would go on to colonize all the galaxies.
I second this. Or rather, the post seems to define “fanatical” in a pretty narrow way, and then take it for granted that only such a “fanatical” AI would wipe out humanity. But the post says “humans are not fanatical”, yet humans have wiped out many, many other species.