If EA focused more on feedback loops, then there’d be less focus on donating money to charity. How would you like these resources to be deployed instead?
At a very high level, a post-EA strategy might look something like this:
1. Invest in yourself, your friends, and your projects.
2. Become less confused about how the world works and what strategies are effective for doing things in it. Reading from a variety of fields (math, history, decision theory, economics, etc.) is a good way to start, as is doing first-principles analyses and writing about them, and visiting places you think might be important to investigate.
3. Find friends you can talk to about this and coordinate with.
4. Make one or more plans for causing nice things to happen, which will probably involve environmentally sustainable positive feedback loops with positive externalities.
5. Execute on these plans until they become obsolete.
Not everyone has to do all these things; some can support people doing this, or be one member of a group that does this. Also, some people don't care enough to actually do these things, and those people probably shouldn't do strategy; they should instead do things they actually care about.
I basically agree with this approach. I have sometimes said that if I could change one thing about EA, it would be that I want more EAs to feel like their job is to understand the world and how it works (rationalists are overall better on this dimension, though they have other problems).
[Note: I’m currently training a practice of noticing when what I say, or what others say, aligns with our personal [social] incentives. The statement above aligns with my incentives insofar as I like figuring things out, and apparently can do it. So if the statement above were true, that would imply that I am doing the “right thing” more than others who are doing other work.]
I’m curious to hear more detail about what you imagine for step 4. What sort of “nice things” do you have in mind? What kind of plans?