Shoot. They did try at the beginning and thought they were recording. A few other points, in addition to my other comment (these are half-remembered, rephrased, and presumably missing parts and context):
Anna hypothesizes that Geoff was selecting who he talked with, worked with, and hired in part based on them being “not too big”, so that he could intellectually dominate them. She tells a story where she and Nate went (in 2017? 2019?) to talk with Geoff; Anna and Nate thought the conversation was good or fruitful or something, but Geoff seemed uninterested, and maybe that’s because Anna and Nate are “too big”.
Geoff describes trying, 2010±2, to get EA orgs to team up / combine, but finding a lack of interest. I got a (speculative) sense of frame-control battles going on in the shadows: Geoff said something like “well, when you think in detail about ambitious plans, you tend to see ways in which other people could fit into them”, and I could imagine his overtures having a subtle sense of trying to capture or define a frame, with some proportion of “here’s how you fit into my plan” rather than “here’s a common goal we’re aiming at, here are synergies between our strategies, and let’s continuously double crux about crucial things”. (It would be bad to punish people for having ambitious plans that involve other people; it would be good to understand how to navigate “provisional plans” that can go in a direction and gain from coordination, while also remaining deeply open to members doing surprising things that upturn the plans, as well as not taking over their souls, etc.) ETA: Anna’s comment here seems to be counterevidence: [Anna] was like “yes, that matches my memory and perception; I remember you [Geoff] and Leverage seeming unusually interested in getting specific collaborations or common projects that might support your goals + other groups’ goals at once, going, and more than other groups, and trying to support cooperation in this way”[...].
Geoff describes feeling unfairly shut out by EA; there was a 2012(?) week-long EA summit run by, and at, Leverage, after which there was EA (+x-risk/rationality?) camaraderie, but then Leverage wasn’t allowed to have a table at some later EA event (maybe a later EA summit).
Geoff describes wanting an “exit strategy” from the rationality community, and describes getting “mixed messages”, like “please stay” and also something like “you’re bad”.
Anna described “narrative addiction”, i.e. addiction to narratives. She used a speculative example/metaphor of an anorexic, who is somehow addicted to the control offered by a narrative, and so rejects information that pushes against the narrative. (I didn’t quite get this; something like: there’s a narrative in which being {good, attractive, healthy, non-selfish...?} can be controlled by not eating, and even if that’s false, it’s nice to think you can achieve those things. The analogy is that thinking [the thing that I’m doing helps with x-risk / life-saving / etc.] is a narrative one could be addicted to in the same way.) Anna hypothesizes that Leverage (as well as other EA/x-risk orgs) had narrative addiction. I’m curious what Leverage’s narratives were, in part because that seemed to play a significant role in Zoe’s experience.