How can the content of this sequence be made practical? Or, how do you plan to apply it in your day-to-day life?
My day-to-day life is populated with many people who do not understand the lessons in this section. Interacting with them is essential to achieving my own goals, so I face a situation in which the rational choice is to communicate irrationally. Specifically, my colleagues and other associates seem to prefer “applause lights” and statements that offer no information. Attaining my personal, rationally selected goals might therefore mean professing irrational beliefs. I don’t think this is an outright paradox, but it is an interesting point. There is a middle ground between “other-optimizing” (pointing out these applause lights for what they are) and changing my actual beliefs to those communicated by the applause lights, but I do not believe it is tenable, and it may represent a conflict of goals (personal success in my field vs. spreading rational thought). Perhaps it is a microcosm of the precarious balance between self-optimization and world-optimization.
I make this sequence practical by making and addressing claims at Less Wrong while trying to avoid ‘the Less Wrong community.’ Claims I can appraise for truth, beauty, or strength. The ‘community’ may have those things too, but it is not my interest. For example, the demographics of Less Wrong are not my interest: the claims of its members, yes; the male/female ratio, not so much. Another example: the effectiveness of altruism, perhaps; altruism itself, less so. I am aided by downvotes that point out errors, and my time is wasted by downvotes that are community-based (‘that’s not how we do it here’).
It could be that this post is itself about community and thus self-contradicts. Ah well.