I don’t mean to dismiss the points of this post, but all of them do need to be reinterpreted in light of the fact that I’d rather have a few really good rationalists as allies than a lot of mediocre rationalists who think “oh, cool” and don’t do anything about it. Consider me systematically concerned with the top 5% rather than the average case. However, I do still care about things like propagation velocities, because they affect what population size that top 5% is 5% of.
Somewhere you said that you are really happy to finally be able to concentrate directly on the matters you deem important, and that you don’t have to raise money anymore. That approach has obviously worked, so you won’t have to change anything. But if you ever need to raise more money for a particular project, my question is: how much of the money you already get comes from people you would consider mediocre rationalists?
I’m not sure whether you expect ever to need a lot of money for an SIAI project, but if you rely solely on those few really good rationalists, you might have a hard time in that case.
People like me will probably always stay on your side, whether or not you tell them they are idiots. But I’m not sure that would be enough in a scenario where donations are important.
I agree with the points made by both ChristianKl and XiXiDu.
As for really good rationalists: I have the impression that you inadvertently alienate even them more often than usual, on account of saying things that sound quite strange.
I think (but am not sure) that you would benefit from spending more time understanding what goes on in neurotypical people’s minds. This would carry not only social benefits (which you may no longer need very much at this point) but also epistemological benefits.
However, I do still care about things like propagation velocities, because they affect what population size that top 5% is 5% of.
I’m encouraged by this remark.
If we think existential risk reduction is important, then we should care about whether politicians think existential risk reduction is a good idea.
I don’t think a substantial number of US congressmen are what you would consider good rationalists.
For Congress to implement good policy in this area would be performance vastly exceeding what we’ve previously seen from them. They called prediction markets “terror markets.” I expect more of the same, and expect to have little effect on them.
The flip side, though, is that if we can frame the issue so that there’s no obvious Democratic or Republican position, then we can, as Robin Hanson puts it, “pull the rope sideways.”
The very fact that much of the existential risk material sounds strange relative to what most people are used to encountering in political arguments might thus work in our favor.
We live in a democracy! How can you not be concerned with 95% of the population? They rule you.
If we lived in some sort of meritocratic aristocracy, perhaps then we could focus our efforts on only the smartest 5%.
As it is, it’s the 95% who decide what happens in our elections, and it’s our elections that decide what rules get made and what projects get funded. The President of the United States could unleash nuclear war at any time. He’s not likely to, but he could. And if he did push that button, it’s over, for all of us. So we need to be very concerned about who is in charge of that button, and that means we need to be very concerned about the people who elect him.
Right now, 46% of them think the Earth is 6000 years old. This worldview comes with a lot of other anti-rationalist baggage like faith and the Rapture. And it runs our country. Is it just me, or does this seem like a serious problem, one that we should probably be working to fix?