The “Why” in “why are you doing it” could be interpreted as “for what purpose,” or “as a result of what causal chain.” Neither of these, at first blush, appears all that fundamental or difficult—but perhaps there’s another sense I’m missing.
Certainly. What’s important, however, is that the process of repeating the “why?” question forces you to 1) think about what it is that you’re doing, in detail, 2) understand what ends these actions serve, and 3) confront the beliefs that make those ends seem desirable in the first place. In effect, asking “what are you doing, and why are you doing it?” forces you to examine not only what you believe, but whether your actions are in alignment with your beliefs. For example:
What are you doing?
I am studying information theory, Bayesian statistics and neuroscience.
And why are you doing that?
I am trying to understand how the brain works, and the first two of those fields appear to be useful mathematical tools for formulating theories about the brain. (Notice I’ve already had to confront a belief here; a “why do you believe this?” question should follow at this point.)
And why are you doing that?
1) It is a very interesting problem. 2) Ultimately, a good theory of the brain will likely contribute to both AI and whole brain emulation (WBE) technologies (belief!), both of which I view as necessary for confronting the issues that will likely arise as the world population increases and as more and more dangerous technologies are developed (synthetic biology, nanotechnology, etc.) (a tangled network of beliefs here, all of which need explaining).
If I were to unravel this further, I would have to confront the fact that AI is itself a dangerous technology, and so address whether my current course of action has a net positive or net negative impact on the chances of a beneficial Singularity (there is an implicit belief here: that the Singularity is plausible enough to warrant thinking about. This, too, requires explanation).
Of course, this process quickly gets messy, but in my view “what are you doing and why are you doing it” is of fundamental importance to any rationalist.