I observe and assert that the type of mind that does this is a disproportionately important mind to influence with “rationalist” or SingInst memes.
From a Singularity perspective, the importance of rationality evangelism is being way overrated. There is still a tendency to mix up rationality and intelligence, as if becoming more rational will produce radically superior problem-solving skills. But if we’re talking about how to solve a problem like Friendly AI design, then what you need above all are people with high intelligence and relevant knowledge. “Aversion to lost purposes”, whatever that is, might be a trait of talented idealistic personalities who get distressed by dead hopes and organizational dysfunction, but some people learn early that that is normality, and their own progress is all the more streamlined for not fighting these facts of life.
In my opinion, the main source of the morale needed to sustain an effort like FAI research, in the midst of general indifference and incomprehension, is simply a sense among the protagonists that they are capable of solving the problem or of otherwise making a difference, and that derives in turn from a sense of one’s own abilities. If the objective is to solve the most difficult problems, and not just to improve the general quality of problem-solving in society, then rationality evangelism is a rather indiscriminate approach.
Agreed that rationality evangelism (edit:) might be overrated; the important thing is spreading the Friendliness-might-be-important memes widely, and SingInst apparently uses “rationality” as one of its memetic weapons of choice. I'm not personally suggesting this memetic strategy is a well-thought-out one. “Aversion to lost purposes” doesn't mean getting distressed because this world isn't the should-world; it means the thing Eliezer talks about in his post “Lost Purposes”.