> “Out of curiosity,” said the Lord Pilot, “have they ever tried to produce even more babies—say, thousands instead of hundreds—so they could speed up their evolution even more?”
>
> “It ought to be easily within their current capabilities of bioengineering,” said the Xenopsychologist, “and yet they haven’t done it. Still, I don’t think we should make the suggestion.”
>
> “Agreed,” said Akon.
That’s not the least bit obvious. Do we really want the Babyeaters to hold back corresponding suggestions that might make our society better from our perspective and worse from theirs?
If, in this situation, we ought to bite the prisoner’s-dilemma bullet to the degree of not invading the Babyeater planet because peaceful situations are, on average, better than war-torn situations, doesn’t the same argument mean that we shouldn’t hold back helpful advice, provided that, on the whole, situations in which helpful advice is given freely are better?
Now maybe it’s the case that if we swapped that particular kind of helpful advice with the Babyeaters, the Babyeater planet would get worse by our standards more than our planet would get better by our standards, and vice versa. But in that case it would be better for both sides to draw up a treaty....