The part about not being excited about anything sounds very accurate and is certainly a part of the problem. I’ve also tried just taking up projects and focusing on them, but I should probably try harder as well.
However, a big part of the problem is that it’s not just that those things feel insignificant; it’s also that I have a vague feeling that I’m sort of putting my own well-being in jeopardy by doing that. As I said, I’m very confused about things like life, death and existence, on a personal level. How do I focus on mundane things when I’m confused about basic things such as whether I (or anyone else) should expect to eventually die or to experience a weird-ass form of subjective anthropic immortality, and about what that actually means? Should that change how I act?
If there is One Weird Trick that you should be using right now in order to game your way around anthropics, simulationism, or deontology, you don’t know what that trick is, you won’t figure out what that trick is, and it’s somewhat likely that you can’t figure out what that trick is because if you did you would get hammered down by the acausal math/simulators/gods.
You also can’t know if you’re in a simulation, a Big quantum world, a big cosmological world, or if you’re a reincarnation. Or one or more of those at the same time. And each of those realities would imply a different thing that you should be doing to optimize your … whatever it is you should be optimizing. Which you also don’t know.
So really I just go with my gut and try to generally make decisions that I probably won’t think are stupid later given my current state of knowledge.
You also can’t know if you’re in a simulation, a Big quantum world, a big cosmological world, or if you’re a reincarnation.
But you can make estimates of the probabilities (EY’s estimate of the big quantum world part, for example, is very close to 1).
So really I just go with my gut and try to generally make decisions that I probably won’t think are stupid later given my current state of knowledge.
That just sounds pretty difficult, as my estimate of whether a decision is stupid or not may depend hugely on the assumptions I make about the world. In some cases, the decision that would be not-stupid in a big world scenario could be the complete opposite of what would make sense in a non-big world situation.
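To make that dependence concrete, here is a minimal toy sketch (the actions and all the payoffs are entirely made up): the action with the higher expected value flips once the probability I assign to a big-world hypothesis gets high enough.

```python
# Toy sketch: made-up payoffs for two made-up actions under two world hypotheses.
PAYOFFS = {
    "focus on mundane projects":       {"big": 1.0, "not_big": 5.0},
    "reorient life around anthropics": {"big": 4.0, "not_big": -2.0},
}

def expected_value(action, p_big):
    """Expected payoff of an action given P(big world) = p_big."""
    pay = PAYOFFS[action]
    return p_big * pay["big"] + (1 - p_big) * pay["not_big"]

for p_big in (0.2, 0.8):
    best = max(PAYOFFS, key=lambda a: expected_value(a, p_big))
    print(f"P(big world) = {p_big}: least-stupid action -> {best}")
```

Which direction the flip goes, and where the threshold sits, is exactly the part I don’t know how to pin down.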
I meant the word “stupid” to carry a connotation of “obviously bad, obviously destroying value.”
Playing with my children rather than working extra hard to earn extra money to donate to MIRI will never be “stupid” although it may be in some sense the wrong choice if I end up being eaten by an AI.
This is true for the same reasons that putting money in my 401K is obviously “not stupid”, especially relative to giving that money to my brother-in-law who claims to have developed a new formula for weatherproofing roofs. Maybe my brother-in-law will become a millionaire, but I’m still not going to feel like I made a stupid decision.
You may rightly point out that I’m not being rational and/or consistent. I seem to be valuing safe, near-term bets over risky, long-term bets, regardless of what the payouts of those bets might be. Part of my initial point is that, as an ape, I pretty much have to operate that way in most situations if I want to remain sane and effective. There are some people who get through life by making cold utilitarian calculations and acting on even the most counterintuitive conclusions, but the psychological cost of behaving that way has not been worth it to me.
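If you want the heuristic spelled out, here is a toy sketch (every number is invented): something like a worst-case rule, rather than a straight expected-value rule, is roughly what I mean by not making decisions I’ll later think were stupid.

```python
# Toy sketch (all numbers invented) of the safe-bet heuristic:
# a worst-case ("never feel stupid later") rule vs. a cold expected-value rule.
BETS = {
    "401k":           [(1.00, 10_000)],                      # safe, near-term
    "brother_in_law": [(0.05, 1_000_000), (0.95, -20_000)],  # risky long shot
}

def expected_value(outcomes):
    """Straight expected payoff: sum of probability * payoff."""
    return sum(p * payoff for p, payoff in outcomes)

def worst_case(outcomes):
    """Payoff of the worst outcome, ignoring how unlikely it is."""
    return min(payoff for _, payoff in outcomes)

for name, outcomes in BETS.items():
    print(f"{name:>15}: EV = {expected_value(outcomes):>9,.0f}, "
          f"worst case = {worst_case(outcomes):>9,.0f}")

# The cold calculation (EV) favors the long shot; the worst-case rule favors
# the 401K, no matter how large the long shot's headline payout gets.
```

That’s just one way to formalize it; I’m not claiming it’s the decision theory anyone should endorse, only that it roughly matches how the ape brain keeps itself sane.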