When I am speaking to people about rationality or AI, and they ask something incomprehensibly bizarre and incoherent, I am often tempted to give the reply that Charles Babbage gave to those who asked him whether a machine that was given bad data would produce the right answers anyway:
I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
But instead I say, “Yes, that’s an important question...” and then I steel-man their question, or I replace it with a question on an entirely different subject that happens to share some of the words from their original question, and I answer that question instead.