You can always shape your public statements for one audience and end up driving away (or failing to convince) another one that’s more important.
This is of course true. I myself am fairly certain that SIAI’s public statements are driving away the people who it’s most important to interest in existential risk.
Suppose Eliezer hadn’t made that claim, and somebody asks him, “do you think the work SIAI is doing has higher expected value to humanity than what everybody else is doing?”, which somebody is bound to do, given that Eliezer is asking rationalists for donations. What is he supposed to say? “I can’t give you the answer because I don’t have enough evidence to convince a typical smart person?”
•It’s standard public relations practice to reveal certain information only if asked.
•An organization that has the strongest case for room for more funding need not be an organization that’s doing something of higher expected value to humanity than what everybody else is doing. In particular, I simultaneously believe that there are politicians who have higher expected value to humanity than all existential risk researchers alive and that the cause of existential risk has the greatest room for more funding.
•One need not be confident in one’s belief that funding one’s organization has highest expected value to humanity to believe that funding one’s organization has highest expected value to humanity. A major issue that I have with Eliezer’s rhetoric is that he projects what I perceive to be an unreasonably high degree of confidence in his beliefs.
•Another major issue I have with Eliezer’s rhetoric is that, even putting issues of PR aside, I personally believe that funding SIAI does not have anywhere near the highest expected value to humanity out of all possible uses of money. So from my point of view, I see no upside to Eliezer making extreme claims of the sort that he has—it looks to me as though Eliezer is making false claims and damaging public relations for existential risk as a result.
I will be detailing my reasons for thinking that SIAI’s research does not have high expected value in a future post.
The level of certainty is not up for grabs. You are as confident as you happen to be; this can’t be changed. You can change the appearance, but not your actual level of confidence. And changing the apparent level of confidence is equivalent to lying.
But it isn’t perceived as so by the general public—it seems to me that the usual perception of “confidence” has more to do with status than with probability estimates.
The non-technical people I work with often say that I use “maybe” and “probably” too much (I’m a programmer—“it’ll probably work” is a good description of how often it does work in practice), as if having confidence in one’s statements were a sign of moral fibre, and not a sign of miscalibration.
Actually, making statements with high confidence is a positive trait, but most people address this by increasing the confidence they express, not by increasing their knowledge until they can honestly make high-confidence statements. And our culture doesn’t correct for that, because errors of calibration are not immediately obvious (as they would be if, say, we had a widespread habit of betting on various things).
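A toy sketch (not from the original discussion) of why a betting habit would expose miscalibration: a proper scoring rule such as the Brier score rewards honest probability estimates and penalizes confident-sounding but miscalibrated ones. The forecasters and event probabilities below are made up for illustration.

```python
import random

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes.
    Lower is better; overconfidence on uncertain events is penalized."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

random.seed(0)
# Simulated events that each actually occur with probability 0.7.
outcomes = [1 if random.random() < 0.7 else 0 for _ in range(10000)]

calibrated = [0.7] * len(outcomes)      # the honest "it'll probably work"
overconfident = [0.99] * len(outcomes)  # sounds more confident, but miscalibrated

print(brier_score(calibrated, outcomes))     # ≈ 0.21
print(brier_score(overconfident, outcomes))  # ≈ 0.29 — worse, despite sounding surer
```

The point is only that under a scoring rule with real stakes, the hedged forecaster wins, whereas in ordinary conversation the overconfident one is never visibly penalized.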
That a lie is likely to be misinterpreted or not noticed doesn’t make it not a lie, and conversely.
Oh, I fully agree with your point; it’s a pity that high confidence on unusual topics is interpreted as arrogance.
Try this: I prefer my leaders to be confident. I prefer my subordinates to be truthful.