Basically, insofar as EA is screwed up, it's mostly caused by bad systems, not bad people, as far as I can tell.
Insofar as you’re thinking I said bad people, please don’t let yourself make that mistake, I said bad values.
There are occasional bad people like SBF but that’s not what I’m talking about here. I’m talking about a lot of perfectly kind people who don’t hold the values of integrity and truth-seeking as part of who they are, and who couldn’t give a good account for why many rationalists value those things so much (and might well call rationalists weird and autistic if you asked them to try).
I don't think differences in values explain much of the differences in results. Sure, truth-seeking vs. impact can hypothetically lead one in different directions, but in practice I think most EAs and rationalists are extremely value-aligned.
This is a crux. I acknowledge I probably share more values with a random EA than a random university student, but I don’t think that’s actually saying that much, and I believe there’s a lot of massively impactful difference in culture and values.
I'm pushing back against Tsvi's claims that "some people don't care" or "EA recruiters would consciously choose 2 zombies over 1 agent"; I think ascribing bad intentions to individuals ends up pretty mindkilly.
I think EA recruiters have repeatedly made decisions like choosing 2 zombies over 1 agent, and were I or Tsvi to look at the same set of options and information we would have made a different decision, because we've learned to care about candor and wholesomeness and respecting other people's sense-making and a bunch of other things. I don't think this makes them bad people. Good values take a lot of work by a lot of people to encapsulate and teach; a good person should not be expected to re-derive an entire culture for themselves, and I think most of the world does not teach people, by the age of 18, all of the values I care about, like lightness and argument and empiricism and integrity and courage and more. They don't care about a number of the values that I hold, and as a result will make decisions counter to those values.
This is a crux. I acknowledge I probably share more values with a random EA than a random university student, but I don’t think that’s actually saying that much, and I believe there’s a lot of massively impactful difference in culture and values.
My best guess is something like a third of rationalists are also EAs, at least going by identification. (I'm being lazy for the moment and not cross-checking "Identifies as Rationalist" against "Identifies as EA", but I can if you want me to, and I'm like 85% sure the less-lazy check will bear that out.) My educated but irresponsible guess is something like 10% of EAs are rationalists. Last time I did a straw poll at an ACX meetup, more than half the people attending were also EAs. Whatever the differences are, they're not stopping a substantial overlap in membership, and I don't think that's just at the level of random members but includes a lot of the notable members.
I’d be pretty open to a definition of ‘rationalist’ that was about more than self-identification, but to my knowledge we don’t have a workable definition better than that. It’s plausible to me that the differences matter as you lean on them a lot, but I think it’s more likely the two groups are aligned for most purposes.
Thanks for the data! I agree there’s a fair bit of overlap in clusters of people.
Two points:
I am talking about the cultural values more than simply the individuals. I think a person's environment really brings very different things out of them. The same person working at Amazon, in DC politics, or at a global-health non-profit will be invited to live out different values and build quite different identities for themselves. The same person in-person and on Twitter can also behave as quite different people. I think LessWrong has a distinct culture from the EA Forum, and I think EAG has a distinct culture from ACX meetups.
Not every person in a scene strongly embodies the ideals and aspirations of that scene. There are many people who come to rationalist meetups with whom I have yet to get on the same page about lots of values; e.g. I still somewhat regularly have to argue against the various reasons people give for endorsing self-deception, even with folks who have been around for many years. The ideals of EA and LW are different.
So even though the two scenes have overlap in people, I still think the scenes live out and aspire to different values and different cultures, and this explains a lot of difference in outcomes.
Insofar as you’re thinking I said bad people, please don’t let yourself make that mistake, I said bad values.
I appreciate you drawing the distinction! The bit about “bad people” was more directed at Tsvi, or possibly the voters who agreevoted with Tsvi.
There’s a lot of massively impactful difference in culture and values
Mm, I think if the question is “what accounts for the differences between the EA and rationalist movements today, wrt number of adherents, reputation, amount of influence, achievements” I would assign credit in the ratio of ~1:3 to differences in (values held by individuals):systems. Where systems are roughly: how the organizations are set up, how funding and information flows through the ecosystem.
(As I write this, I realize that maybe even caring about adherents/reputation/influence/achievement in the first place is an impact-based, EA-frame, and the thing that Ben cares about is more like “what accounts for the differences in their philosophies or gestalt of what it feels like to be in the movement”; I feel like I’m lowkey failing an ITT here...)
Mm, I think if the question is “what accounts for the differences between the EA and rationalist movements today, wrt number of adherents, reputation, amount of influence, achievements” I would assign credit in the ratio of ~1:3 to differences in (values held by individuals):systems. Where systems are roughly: how the organizations are set up, how funding and information flows through the ecosystem.
I can think about that question if it seems relevant, but the initial claim of Elizabeth’s was “I believe there are ways to recruit college students responsibly. I don’t believe the way EA is doing it really has a chance to be responsible”. So I was trying to give an account of the root cause there.
Also (and I recognize that I'm saying something relatively trivial here) the root cause of a problem in a system can of course be any seemingly minor part of it. Just because I'm saying one part of the system is causing problems (the culture's values) doesn't mean I'm saying that's what's primarily responsible for the output. The cause of a software company's current problems might be the slow speed at which PR reviews are happening, but this shouldn't be mistaken for the claim that the credit for the company's success goes primarily to its ability to do PR reviews fast.
So to repeat: I'm saying that IMO the root cause of irresponsible movement growth and ponzi-scheme-like recruitment strategies was a lack of very important values like dialogue and candor and respecting other people's sense-making and courage and so on, rather than an explanation more like 'those doing recruitment had poor feedback loops so had a hard time knowing what tradeoffs to make' (my paraphrase of your suggestion).
I would have to think harder about which specific values I believe caused this particular issue, but that’s my broad point.