[00:31:25] Timothy: …This is going to be like, they didn’t talk about any content, like there’s no specific evidence,
[00:31:48] Elizabeth: I wrote down my evidence ahead of time.
[00:31:49] Timothy: Yeah, you already wrote down your evidence.
I feel pretty uncertain about how far I agree with your views on EA, and this podcast didn’t really help me decide, because there wasn’t much discussion of specific evidence. Where is it all written down? I’m aware of your post on vegan advocacy, but it’s unclear to me whether there are many more examples. I also heard a similar line of despair about EA epistemics from other long-time rationalists while hanging around Lighthaven this summer, but basically no one brought up specific examples.
It seems difficult to characterize the EA movement as the monolith you’re treating it as. The case of vegan advocacy is mostly irrelevant to my experience of EA: I have little contact with vegan advocates, and most of the people I spend time with in EA circles seem to have quite good epistemics.
However, I can relate to your other example, because I’m one of the “baby EAs” who was vegetarian and in the Lightcone offices in summer 2022. But my experience provides something of a counterexample. I became vegetarian before encountering EA, and mostly learned about the potential nutritional problems from other EAs. When you wrote your post, I got myself tested for iron deficiency and started taking supplements (although not for iron). I eventually stopped being vegetarian and instead offset my impact with donations to animal charities, even though this isn’t very popular in EA circles.
My model is that people exist on a spectrum from weird to normie. The weird people are often willing to pay social costs to be more truthful, while the more normie people refrain from saying, and thinking, difficult truths. But most people are mostly fixed at a certain point on that spectrum. The truth-seeking weirdos probably made up a larger proportion of the early EA movement, but I’d guess that in absolute terms the number of such people hanging around EA spaces has not declined, and their epistemics have not degraded; there just aren’t very many of them in the world. These days there is simply a greater number of normie people in EA circles too.
And yes, that dilutes the density of good epistemics in EA. But it doesn’t seem like a reason to abandon the movement; it’s a sign that more people are being influenced by good ideas, which creates opportunities for the movement to do bigger things.
When you want interesting discussions with epistemic peers, you can still find your own circles within the movement to spend time with, and you can still come to the (relative) haven of LessWrong. If LessWrong culture also faced a similar decline in epistemic standards I would be much more concerned, but it has always felt to me like EA is the applied, consumer-facing product of the rationalist movement, one that targets real-world impact over absolute truth-seeking. For example, I think most EAs (and also some rationalists) are hopelessly confused about moral philosophy, but I’m still happy there are more people trying to live by utilitarian principles who might otherwise not be trying to maximize value at all.
There are links in the description of the video.