So you’re saying if we stop believing in stock market crashes, they go away?
I think what you mean is that if you intervened to change everyone’s beliefs away from “oh shit, sell!”, then stock market crashes would not happen. That is a different matter than talking about just my or your belief.
So you’re saying if we stop believing in stock market crashes, they go away?
More often it works the other way around: the fact that someone stops believing in an overinflated stock market (i.e. claims a “bubble” is about to burst) acts as a self-fulfilling prophecy, causing others to also stop believing, which, if the information cascade propagates far enough, will cause a crash, thereby bringing reality in line with the original belief.
But information cascades can also cause booms; as I understand it, this is more likely with individual stocks.
The “someone” above is underspecified: it can be one particularly influential person—Nate Silver recounts how Amazon stock surged 25% after Henry Blodget hyped it up in 1998. But it can also be a larger group, who, looking at small fluctuations in the market, panic and start a stampede.
My point is that “thought bubbles” in general are part of reality. Your believing in things has causal influence on reality (another concrete example: romantic relationships—the concept “love”, which can be cashed out in terms of blood levels of various hormones, is one of those things that go away when people stop believing in them). It is generally bad epistemic practice to overstate this influence, but it can also be bad to understate it.
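As a toy illustration of the cascade dynamic (not a claim about how real markets work), here is a minimal threshold-model sketch in Python: a few seed traders stop believing, and everyone else sells once the fraction of sellers they observe crosses a personal threshold. All names and parameters are made up for illustration; depending on the random draw of thresholds, the seed either fizzles out or propagates into a full sell-off.

```python
# Minimal threshold-model sketch of an information cascade.
# Everything here (names, parameters, the linear price-impact rule) is
# illustrative, not a model of any real market.
import random

random.seed(0)

N_TRADERS = 1000     # population of traders
SEED_SELLERS = 5     # the initial "someone(s)" who stop believing
PRICE_IMPACT = 1.0   # price falls in proportion to the fraction who sell

# Each trader sells once the fraction of sellers they observe exceeds
# their personal threshold; thresholds are drawn uniformly at random.
thresholds = [random.random() for _ in range(N_TRADERS)]
selling = [False] * N_TRADERS
for i in range(SEED_SELLERS):
    selling[i] = True

price = 100.0
while True:
    fraction_selling = sum(selling) / N_TRADERS
    # Traders whose threshold has been crossed join the sell-off.
    new_sellers = [
        i for i, t in enumerate(thresholds)
        if not selling[i] and fraction_selling >= t
    ]
    if not new_sellers:
        break
    for i in new_sellers:
        selling[i] = True

fraction_selling = sum(selling) / N_TRADERS
price *= (1 - PRICE_IMPACT * fraction_selling)
print(f"Fraction selling: {fraction_selling:.1%}, price: {price:.2f}")
```

Whether the cascade stalls or runs away depends entirely on the distribution of thresholds, which is the sense in which a handful of initial disbelievers can, or can fail to, bring reality in line with their belief.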
Agreed.
My point was that your examples are part of reality in a way that the idealized observer’s belief invoked in the “reality is that which...” mantra isn’t.