Hi, I think the reason people like me freak out about things like this is that we tend to accept new ideas quite quickly (e.g. if someone showed me actual proof that God is real, I would abandon my 7 years of atheism in a heartbeat and become a priest), so it’s quite emotionally salient for me to imagine things like this. And simply saying “You’re worrying too much, find something else to do to take your mind off of things like this” doesn’t really help, since it’s like saying to a depressed person “Just be happy, it’s all in your head.”
I think the better comparison is a depressed person saying “Life sucks because X”, whose friend then tries to disprove X; ultimately the person is still depressed, and it wasn’t really about X in particular.
I have on my to-do list to write (or have someone write) a post that tries to spell out why and how to chill out about this. Unfortunately it’s a fair amount of work, and I don’t expect whatever quick reason I give you here to especially help.
I do generally think “Be careful about taking ideas seriously. It’s a virtue to be ready to take ideas seriously, but the general equilibrium where most people don’t take ideas too seriously was a fairly important memetic defense. I.e. most people believe in God, but they also don’t take it too seriously. The people who do take it seriously do a lot of damage. It’s dangerous to be half-a-rationalist. etc.”
I think one relevant insight is that you should weight the experience of your various multiverse-selves by their measure, and the fact that a teeny sliver of reality has some random thing happening to you isn’t very relevant.
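To make the measure-weighting point concrete, here is a minimal sketch with entirely made-up numbers (the branch names, measures, and utilities are illustrative assumptions, not anything from the discussion above). It shows that when you weight each branch of experience by its measure, a branch with a teeny-tiny measure barely moves the expectation, no matter how bad that branch is:

```python
# Hypothetical illustration: weight each "branch" of experience by its
# measure. The numbers below are invented for the example.
branches = [
    {"name": "ordinary life", "measure": 0.999999, "utility": 10.0},
    {"name": "rare awful scenario", "measure": 0.000001, "utility": -1000.0},
]

# Measure-weighted expectation over branches (normalized by total measure).
total_measure = sum(b["measure"] for b in branches)
expected = sum(b["measure"] * b["utility"] for b in branches) / total_measure

print(expected)  # dominated almost entirely by the high-measure branch
```

Even though the awful branch has a utility of −1000, its measure of one-in-a-million means the expectation stays very close to the ordinary branch’s utility of 10.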
Glad to hear you’re planning to write up a post covering stuff like this! I personally think it’s quite overdue, especially on a site like this, which I suspect has an inherent selection effect toward people who take ideas quite seriously, like me. I don’t quite understand the last part of your reply, though. I understand the importance of measure in decision-making, but like I said in my post, I thought that if the blackmailer makes a significant number of simulations, then indexical uncertainty could still be established, since it could still have a significant effect on your future observer-moments. Did I make a mistake anywhere in my reasoning?
My suggestion is to first make sure that your reasoning is sane, free from subconscious effects leaking into it. By “leaking in” I mean worrying interpretations becoming more salient, or your reasoning becoming less rigorous in areas where feelings play a role.
See The Treacherous Path to Rationality for some more aspects. You should be on stable footing before you approach the monsters.