This quote from Moral Mazes seems relevant to this earlier discussion, and may shed further light on why markets were slow to respond to the pandemic (emphasis mine):
This explains why the chemical company managers kept putting off a decision about major reinvestment. After the battery collapsed in 1979, however, the decision facing them was simple and posed little risk. The corporation had to meet its legal obligations; also, it had either to repair the battery the way the EPA demanded or shut down the plant and lose several hundred million dollars. Since there were no real choices, everyone could agree on a course of action because everyone could appeal to inevitability. This is the nub of managerial decision making. As one manager says: “Decisions are made only when they are inevitable. To make a decision ahead of the time it has to be made risks political catastrophe. People can always interpret the decision as an unwise one even if it seems to be correct on other grounds.” (Location 1886)
In Feb/March, if the relevant financial institutions were going through such a behind-the-scenes process of “establishing the inevitability” of the pandemic before large market-moving decisions could be made, this could explain the apparent delay (and corresponding opportunity for the rational individual investor). One can imagine individuals within these firms feeling each other out—“This pandemic might turn into a big deal, huh?” “Yeah, but the boss hasn’t seemed too concerned yet, let’s give it another few days before we bring it up again”—before the consensus grew large enough that the decision became inevitable.
If this model is accurate, when would we expect to see these kinds of delays (and opportunities) in other situations? Here are some factors that may have contributed:
The early pandemic required integrating a lot of information outside the core areas of expertise of firms and their traders, leading to more uncertainty and a longer delay to reach consensus.
People are bad at extrapolating exponential growth (citation needed), and while some individuals within firms may have realized the implications right away, others may have thought their concerns were way overblown, again prolonging the time to reach consensus. (A toy sketch after this list illustrates how badly a linear extrapolation undershoots an exponential.)
This was a rare event that had not occurred within anyone’s living memory, so there was no good frame of reference to fall back on, also increasing uncertainty.
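To make the exponential point a bit more concrete, here is a toy sketch in Python. The numbers are entirely made up for illustration (100 starting cases, a 3-day doubling time) rather than drawn from any real data; it just shows how far a naive straight-line projection, fit to the first week, falls behind a process that keeps doubling:

```python
# Toy illustration with made-up numbers: a quantity that doubles every 3 days
# versus a naive linear projection extended from its first week of growth.

def exponential(day, start=100, doubling_time=3):
    """Cases on `day` if they double every `doubling_time` days."""
    return start * 2 ** (day / doubling_time)

def linear_guess(day, start=100, doubling_time=3, window=7):
    """Naive projection: extend the average daily increase seen over the first `window` days."""
    daily_increase = (exponential(window, start, doubling_time) - start) / window
    return start + daily_increase * day

for day in (7, 14, 21, 28):
    actual = exponential(day)
    guess = linear_guess(day)
    print(f"day {day:2d}: actual ~{actual:7.0f}, linear guess ~{guess:5.0f}, "
          f"underestimate ~{actual / guess:.0f}x")
```

By day 28 the straight-line guess is off by more than a factor of thirty, which is exactly the kind of gap that could make early internal warnings look overblown to colleagues extrapolating linearly.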
I feel like there’s something here worth investigating more closely, although I’m still having trouble understanding it as well as I would like to. For now I’ll note that these three factors also seem very applicable to the current state of AGI development, and so may tie in with previous discussions such as this one.