Never claimed to be—I have long argued for the most effective communication techniques to promote EA ends.
I don’t believe I am wrong here. My rich uncle doesn’t read Less Wrong. However, those who have rich uncles do read Less Wrong. If I can sway even a single individual to communicate effectively, rather than to maximize transparency, when persuading people to give money effectively, I’ll be glad to have done so.
You seem to be suggesting that I had previously advocated being as transparent as possible. On the contrary—I have long advocated for the most effective communication techniques to achieve EA ends.
Sarah’s post highlights some of the essential tensions at the heart of Effective Altruism.
Do we care about “doing the most good that we can” or “being as transparent and honest as we can”? These are two different value sets. They will sometimes overlap, and sometimes they will not.
And please don’t say that “we do the most good that we can by being as transparent and honest as we can,” or that “being as transparent and honest as we can” is best in the long term. Just don’t. You’re simply lying to yourself and to everyone else if you say that. If you can’t imagine a scenario where “doing the most good that we can” and “being as transparent and honest as we can” are opposed, you’ve fallen into a failure mode by flinching away from the truth.
So when push comes to shove, which one do we prioritize? When we have to throw the switch and have the trolley crush either “doing the most good” or “being as transparent and honest as we can,” which do we choose?
For a toy example, say you are talking to your billionaire uncle on his deathbed, trying to convince him to leave money to AMF instead of his current favorite charity, the local art museum. You know he would respond better if you exaggerated the impact of AMF. Would you do so, whether by lying by omission or in any other way, in order to get much more money for AMF, given that no one else would ever find out? What if you knew that other family members were standing in the wings, ready to use all sorts of lies to advocate for their favorite charities?
If you do not lie, that’s fine, but then please don’t pretend that you care about doing the most good. Just don’t. You care about being as transparent and honest as possible more than you care about doing the most good.
If you do lie to your uncle, then you do care about doing the most good. However, you should consider at what price point you would refuse to lie—at that point, we’re just haggling over the price.
The people quoted in Sarah’s post, myself included, all highlight how doing the most good sometimes involves not being as transparent and honest as we can. Different people have different price points, that’s all. We’re all willing to bite the bullet and sometimes send that trolley over transparency and honesty—whether by questioning the value of public criticism, as Ben does, by appealing to emotions, as Rob does, or by using intuition as evidence, as Jacy does—for the sake of what we believe is the most good.
As a movement, EA has a big problem with believing that the ends never justify the means. Yes, sometimes the ends do justify the means—at least if we care about doing the most good. We can debate whether, in any given case, we are mistaken about the ends justifying the means, but using insufficient means to accomplish the ends is just as bad as using excessive means. If we are truly serious about doing the most good possible, we should let our end goal be the North Star and work backward from there, rather than hobbling ourselves with preconceived notions of “intellectual rigor” at the cost of doing the most good.
Rationality 101 videotaped presentation with link to slides in description (from our LessWrong meetup introductory event)
Thank you!
This is probably too complex to hash out in comments—lots of semantic issues, and some strategic/tactical information that might be best to avoid discussing publicly. If you’re interested in getting involved in the project and want to chat on Skype, email me at gleb [at] intentionalinsights [dot] org
We chose the issue of lies specifically because it is something a broad range of people can get behind opposing, across the political spectrum. Otherwise, we would have to choose among political virtues, and that’s always a trade-off. So the two fundamental orientations of this project are utilitarianism and anti-lies.
FYI, we plan to tackle sloppy thinking too, as I did in this piece, but that’s more complex, and it’s important to start with simple messages. Heck, if we can get people to realize the simple difference between truth and comfort, I’d be happy.
Agreed on the issues around measuring lies, and I concede the point—LW gold to you for highlighting it.
I hear you about “rationalism in politics.” The public-facing aspect of this project will use terms like “post-lies movement” and so on. We’re using “Rational Politics” as the internal, provisional name for now, while we are gathering allies and spreading the word about the project rather than doing much public outreach.
I’m talking about prioritizing the good of the country as a whole, not necessarily distant strangers—although in my personal value stance, that would be nice. Like I said, it’s an EA project :-)
At this point, I’m finished engaging with you, since you’re clearly not making statements based on reality. Good luck with growing more rational!
I’m going with the official definition of post-truth here, and am comfortable standing by it.
Nice, didn’t know that—thanks for pointing it out! I’ve updated slightly on the credibility of the NYTimes on this basis.
I see the situation right now as one where more liberals than conservatives are close to rational thinking, though that hasn’t been the case in the past. I don’t know how this document would read if more conservatives than liberals were close to rational thinking.
Regarding the Muslim issue, you might want to check out the radio interview I linked in the document. It shows very clearly how I got a conservative talk show host to update toward being nicer to Muslims.
If you’re interested in participating in this project, email me at gleb [at] intentionalinsights [dot] org
Agree that the attempts to rid academia of conservatives are bad.
Can you be comfortable saying that Trump lies more often, and more brazenly, than prominent liberal politicians; usually does not back away from lies when called out; slams the credibility of those who call him out on lies; focuses on appealing to emotions over facts; and tends to avoid providing evidence for his assertions (such as the claim that Russia was not behind the hack)? This is what is meant by post-truth in the Oxford Dictionaries definition of the term.
Yup, agreed that it may well not be wise for those who have racist beliefs to be open about them. The same applies to the global warming stuff.
This is why I say this is a project informed by EA values—it comes from the perspective that voting is like donating thousands of dollars to charity and that voters care about the public good. It’s not meant to target those who don’t care about the public good—just those mistaken about what is the best way to achieve the public good. For instance, plenty of voters are mistaken about the state of reality, and some of those folks would genuinely want the most good. The project is not meant to reach all, in other words—just that select slice.
Yup, agreed that it may well not be worthwhile for voters who vote for reasons not oriented toward the most social good to vote rationally. This is why I say this is a project informed by EA values—it comes from the perspective that voting is like donating thousands of dollars to charity. For those who are purely self-interested, it’s really not rational to vote.
I am comfortable with saying that my post is anti-post-truth politics. I think most LWers would agree that Trump relies more on post-truth tactics than other politicians do. Note that I called out Democrats for doing so as well.
I have plenty of social status, and sufficient money, as a professor. I don’t need any more personally. In fact, I’ve donated about $38K to charity over the last 2 years. My goal is EA ends. You can choose to believe me or not :-)