I do indeed think orienting is an underappreciated skill (I have a blogpost in the works about orienting being a key skill for the 21st century, because stuff is gonna just Keep On Happening faster than we are used to responding to)
Some specific times I oriented:
Covid
Notably, I think I oriented once at the beginning of covid, and then failed to orient a couple more times when it was appropriate, i.e. 3 months into covid and again a year later. But I think I learned lessons from that which were applicable to...
During the Ukraine/Russia nuke scare
Lightcone did some nuclear prep stuff, and, notably as a “lesson learned from covid”, also thought more about how much would be “too much” and what timebox to set on the endeavor
When FTX was becoming a prominent funder
I individually, and Lightcone as a whole, thought about how much money to pay people and how to change our spending habits
Lightcone team considered moving to the Bahamas. We chose not to, which we’re pretty glad about because…
Again, when FTX collapsed
“Man, what life lessons is EA as a whole supposed to learn from this?”
When the world started responding to ChatGPT
Since ChatGPT, AI things have been moving quite quickly and I’ve been doing an orient step every couple weeks (in response to things like the TIME piece, various Open Letters, White House actions, etc)
This has all mostly come in the form of considering new plans (i.e. changing my “deciding” step but not my “action” step, so far)
I think actually the observation step is also underappreciated. A common first-level mistake is “forgetting to orient”, and a separate common mistake is blurring observation and orienting together, when observation is actually a distinct step.
Those kind of sound like decisions. Is the difference that you paused a little longer and sort of organized your thoughts beyond what was immediately necessary? Or how would you describe the key differentiating thing here?
In each case, I was sitting around doing some unrelated thing, and then noticed “hey, this observation (covid, ukraine, ftx) seems maybe decision-relevant. Let’s think about it and see if it changes the space of decisions I might want to make”
In each of these cases, if I hadn’t oriented, I would have been doing stuff completely unrelated to covid/ukraine/ai-regulation (which in my case usually looked like “building LW features and/or Lightcone offices, and having some nice hobbies”).
The thing that’s maybe a bit more confusing is the ChatGPT/Open-Letters case, where I had already been working on AI risk mitigation, but focused on one particular strategic frame (i.e. building infrastructure for AI researchers), because I’d previously judged that “helping the world-at-large orient to x-risk” wasn’t very tractable. But I’d cached the strategic thought “it’s too intractable to help the world orient to AI risk” and not re-examined it, and it took me ~a month or two after ChatGPT came out to realize that it was changing that landscape and that I should open up a whole set of possible decisions I’d previously pruned off.