You also can’t know if you’re in a simulation, a big quantum world, a big cosmological world, or if you’re a reincarnation.
But you can make estimates of the probabilities (EY’s estimate of the big quantum world part, for example, is very close to 1).
So really I just go with my gut and generally try to make decisions that I probably won’t later consider stupid, given my current state of knowledge.
That just sounds pretty difficult, as my estimate of whether a decision is stupid may depend hugely on the assumptions I make about the world. In some cases, the decision that would be not-stupid in a big-world scenario could be the complete opposite of what would make sense in a non-big-world situation.
I meant the word “stupid” to carry a connotation of “obviously bad, obviously destroying value.”
Playing with my children rather than working extra hard to earn extra money to donate to MIRI will never be “stupid,” although it may be in some sense the wrong choice if I end up being eaten by an AI.
This is true for the same reasons that putting money in my 401K is obviously “not stupid,” especially relative to giving that money to my brother-in-law who claims to have developed a new formula for weatherproofing roofs. Maybe my brother-in-law will become a millionaire, but I’m still not going to feel like I made a stupid decision.
You may rightly point out that I’m not being rational and/or consistent. I seem to be valuing safe, near-term bets over risky, long-term bets, regardless of what the payouts of those bets might be. Part of my initial point is that, as an ape, I pretty much have to operate that way in most situations if I want to remain sane and effective. There are some people who get through life by making cold utilitarian calculations and acting on even the most counterintuitive conclusions, but the psychological cost of behaving that way has not been worth it to me.
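To make the bet-versus-payout tension concrete, here is a minimal sketch of the expected-value comparison implied above. The probabilities and payouts are entirely hypothetical, chosen only to illustrate why a pure expected-value maximizer would take the risky long-term bet even while the reasoning above still prefers the safe one.

```python
# Hypothetical expected-value comparison of a safe near-term bet
# versus a risky long-term bet. All numbers are made up for illustration.

def expected_value(probability_of_payout: float, payout: float) -> float:
    """Expected value of a simple bet with a single possible payout."""
    return probability_of_payout * payout

# Safe, near-term bet: e.g. steady 401K contributions.
safe_ev = expected_value(probability_of_payout=0.95, payout=1_000)

# Risky, long-term bet: e.g. the brother-in-law's roofing formula,
# or working extra hours to fund speculative far-future payoffs.
risky_ev = expected_value(probability_of_payout=0.001, payout=10_000_000)

print(f"Safe bet EV:  {safe_ev:,.0f}")   # 950
print(f"Risky bet EV: {risky_ev:,.0f}")  # 10,000
# A pure expected-value maximizer takes the risky bet; the point above is
# that an ape who wants to stay sane and effective may refuse to live that way.
```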