It would be very surprising if all the substantial advice one had for me turned out identical to what merely mortal soft transhumanists with environmental and social concerns are already saying.
Obviously a human author cannot foresee what “substantial advice” a transhuman intelligence would actually offer; therefore the story’s god probably just functions as a mouthpiece for the author. I suggest you stop thinking about whether a superintelligent being would in fact say the things that character does, and start thinking about whether those things make sense on their own terms.
You don’t say.
I know an authorial tract when I see one; but putting aside my basic distaste for the form and considering the short story as persuasive writing, I’d say it’s actually worse. It presents little reasoning, practically no evidence, and not even any especially interesting ideas. There’s nothing there but a sketch of a future history and a few generic environmental and social warnings, and I can get that without the smug overtones (well, without the same smug overtones, at least) on any left-leaning transhumanist blog.