I completely forgot about this interview, so I already knew why Greg Egan isn’t that worried:
… I think there’s a limit to this process of Copernican dethronement: I believe that humans have already crossed a threshold that, in a certain sense, puts us on an equal footing with any other being who has mastered abstract reasoning. There’s a notion in computing science of “Turing completeness”, which says that once a computer can perform a set of quite basic operations, it can be programmed to do absolutely any calculation that any other computer can do. Other computers might be faster, or have more memory, or have multiple processors running at the same time, but my 1988 Amiga 500 really could be programmed to do anything my 2008 iMac can do — apart from responding to external events in real time — if only I had the patience to sit and swap floppy disks all day long. I suspect that something broadly similar applies to minds and the class of things they can understand: other beings might think faster than us, or have easy access to a greater store of facts, but underlying both mental processes will be the same basic set of general-purpose tools. So if we ever did encounter those billion-year-old aliens, I’m sure they’d have plenty to tell us that we didn’t yet know — but given enough patience, and a very large notebook, I believe we’d still be able to come to grips with whatever they had to say.
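Egan’s point about “a set of quite basic operations” can be made concrete with a toy sketch (mine, not from the interview): a simulator for a machine that can only read a symbol, write a symbol, move its head, and change state. That tiny repertoire is already enough, in principle, to run any computation; the example machine below just increments a binary number.

```python
# A minimal Turing machine simulator -- an illustrative sketch of the
# "basic operations" Egan mentions. The machine can only read, write,
# move the head one cell, and change state, yet this repertoire is
# Turing-complete. The example rules increment a binary number.

def run_turing_machine(tape, rules, state="start", pos=0, max_steps=10_000):
    """Run a Turing machine. `rules` maps (state, symbol) ->
    (new_symbol, move, new_state), where move is -1 (left) or +1 (right).
    The machine stops when it reaches the "halt" state."""
    cells = dict(enumerate(tape))  # sparse tape; blank cells read as "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[pos] = new_symbol
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Binary increment: scan right past the number, then carry 1s leftward.
rules = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", -1, "done"),
    ("carry", "_"): ("1", -1, "done"),
    ("done", "0"): ("0", -1, "done"),
    ("done", "1"): ("1", -1, "done"),
    ("done", "_"): ("_", 0, "halt"),
}

print(run_turing_machine("1011", rules))  # 1011 + 1 = 1100
```

The Amiga 500 vs. iMac comparison is this idea in hardware form: a slower machine with less memory runs the same class of programs, just with more patience (or floppy-swapping) required.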
He should try telling that to the Aztecs, or better yet, the inhabitants of Hispaniola. Turns out that ten thousand years of divergence can mean instant death, no saving throw.