Hello all. I don’t think I identify as a “rationalist” exactly. I think of rationality more as a mode of thought (singing or playing a musical instrument, for example, is a different mode of thought, and many different modes are natural and appropriate for us human animals). It is a very useful mode, though, and worth cultivating. It does strike me that the goals targeted by “Instrumental Rationality” are only weakly related to what I would consider “rationality,” and that for most people skills like focus and confidence far surpass things like Bayesian updating for the practical achievement of goals. I also fear that our poor ability to gauge priors often makes human Bayesianism provide more of the appearance of rationality than actual improvement in day-to-day reasoning.
Still, there’s no denying that epistemic and instrumental rationality drive much of what we call “progress,” and the more skilled we are in their use, the better. I would like to improve my own world-modeling skills.
I am also very interested in a particular research program that is not presently an acceptable topic of conversation here. Since that program has no active discussion forum anywhere else (odd, given how important many people here consider it), I am hopeful that in time it will become an active topic—as “rationality incarnate,” if nothing else.
I thank all of the authors here for providing interesting material and hope to contribute myself, at least a little.
Oh, I’m a 45-year-old male software designer and researcher working for a large computer security company.