Goals. What do you want to get out of this? What do you really want? Doing lots of things quickly and intensely is one way to describe a rat race. Yes, you can run faster, but where are you running to?
Good question. This thing has a lot of opportunity for lost purposes. My particular goal is to get enough power to save the world, which does sometimes conflict with maximum vampire-mode. They often agree, though, and the vampire heuristic pushes harder than the save-the-world goal, because vanity metrics like “how vampire-mode are you” are more motivating than important things like saving the world.
Risks. There is a certain aura of invincibility surrounding this post.
Another good point. Let’s not be stupid and get killed. I, for one, think it is probably not a good idea to go fight in a revolution or take on other similarly dangerous activities, but I put it in anyway because it was a good example.
My particular goal is to get enough power to save the world, which does sometimes conflict with maximum vampire-mode.
Maybe instead of thinking in 1000-year-old vampire terms for this, it’s better to think in intelligence explosion terms. For example, let’s say you have a list of goals you want to accomplish. One is to get more sleep, which you expect, among other things, to improve your focus. Another is to meditate intensely for 10 minutes every day, which you expect, among other things, to improve your willpower. Each of these goals helps to some degree with the other. If you had a lot of goals that all improved your capacity to work towards the other goals, and improved your capacity in general, then you could potentially see exponential growth in your ability to do things. (Some of your goals should probably be object-level though, because it’s good to intermix self-improvement goals with object-level goals to see if you are actually building the kind of capacities you need.)
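To make the compounding intuition concrete, here is a minimal toy sketch (the goal names and numbers are my own inventions for illustration, not claims about how big real gains are) of what “each goal boosts your capacity to pursue the others” looks like when the gains multiply rather than add:

```python
# Toy model of mutually reinforcing goals (illustrative only).
# Each goal, once kept, gives a small fractional boost to overall
# "capacity"; because the boosts multiply, capacity compounds.

# Hypothetical goals and their assumed per-week capacity boosts.
goals = {
    "more sleep": 0.05,
    "daily meditation": 0.03,
    "object-level project": 0.02,
}

capacity = 1.0  # abstract "ability to get things done", baseline = 1

for week in range(1, 11):
    for boost in goals.values():
        # Each gain multiplies the base that the next gain acts on.
        capacity *= 1 + boost
    print(f"week {week:2d}: capacity = {capacity:.2f}")

# Growth is roughly exponential (~10% per week in this made-up run);
# the same boosts merely added together would grow only linearly.
```

In this made-up run, capacity ends up around 2.7× baseline after ten weeks, versus about 2× if the same boosts only added; that gap is the whole point of stacking capacity-building goals.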
But let’s say that you’ve tried and failed at both the meditation and sleep goals in the past. In that case, you are having a hard time getting the exponential growth cycle started, and you’re probably better off taking things from another angle, kind of like a game of sudoku. So develop and test a hypothesis about why your meditation goal failed, or read up on strategies people have used to overcome whatever problem you think you were having. Or alternatively, find the small capacity-building thing that you think is quite likely to stick for the long term, and achieve it so you gain a toehold. Then celebrate and attempt the capacity-building thing that you think is the next hardest to get to stick, and so on. This also doubles as establishing a success spiral and builds the ability to maintain commitments to yourself (and is also kaizen).
Probably the easiest sort of “toehold” to establish is just to start learning more about yourself. Read up on ugh fields, reinforcement, and things like that. Maybe spill your guts to a friend, or start keeping a productivity diary… every entry you make has a small expected self-knowledge gain. Basically, try to get as much insight into yourself as you can, because self-knowledge is an irreversible capacity gain (unlike, say, a habit, which, once lost, will have to be re-established).
(I’m sure I’m not the only one who’s wondered if the effective altruist community is best off overtly telling everyone we are focused on things like GiveWell, MIRI, etc. but covertly choosing to focus most of our resources on capacity-building, knowing that the goals we’ve set for ourselves are big enough that they aren’t going to be realistically accomplished with our current cohort.)
I’m sure I’m not the only one who’s wondered if the effective altruist community is best off overtly telling everyone we are focused on things like GiveWell, MIRI, etc. but covertly choosing to focus most of our resources on capacity-building, knowing that the goals we’ve set for ourselves are big enough that they aren’t going to be realistically accomplished with our current cohort.
I don’t think deception is even necessary here. Leverage Research is basically openly telling people they’re focused on capacity building and making humans smarter & more capable, and CFAR seems to be doing the same thing from a different angle.
I’m advocating deception in the sense of not telling people that possibly a majority of the resources in the EA movement are devoted to expanding the EA movement, at least not right off. I think it’s fine that Leverage and CFAR are upfront about their missions.
Let us recall the maxim about people advocating big lies on the internet, the suitability of said people for propagating said lies, and their likelihood of success.