I think what you’re doing is something that in psychology is called “catastrophizing”. In essence, you’re taking a mere unproven conjecture or possibility, exaggerating the negative severity of its implications, and then reacting emotionally as if this worst-case scenario were true or significantly more likely than it actually is.
The proper protocol, then, is to re-familiarize yourself with Bayes’ theorem (especially the concepts of evidence and priors), compartmentalize things according to their uncertainty, and try to step back and look at your actual beliefs and how they make you feel.
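(For reference, Bayes’ theorem, writing H for the feared hypothesis and E for your evidence:

\[ P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}. \]

A hypothesis with a low prior P(H) stays improbable after updating unless the evidence is substantially more likely under H than under the alternatives.)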
Rationality is more than just recognizing that something could be true; it is also assigning appropriate degrees of belief to ideas that span a wide range of certainties and probabilities. What I am seeing repeatedly from your posts about the “dangers” of certain ideas is that you’re attaching far too much fear to things that other people aren’t afraid of.
To use an overused quote: “Fear is the mind-killer.”
Try to look at the consequences of these ideas as dispassionately as possible. You cannot control everything that happens to you, but you can, to an extent, control your response to these circumstances.
For instance, with Dust Theory, you say that you gave it at most a 10% chance of being true, and it was paralyzing to you. This shouldn’t be. First, you need to consider your priors and the evidence. How often in the past have you had the actual experience that Dust Theory suggests is possible and which you fear? What actual experiential evidence do you have to suggest that Dust Theory is true?
For that matter, one of the common threads of your fears seems to be that “you” cease to exist and are replaced by a different “you”, or that “you” die. But the truth is that people are already constantly changing. The “you” from 10 years ago was made up of different atoms than the “you” 10 years from now will be, by virtue of the fact that our cells are constantly dying and being replaced. The thoughts we have also change from moment to moment, and our brains adjust the strengths of the connections between neurons in order to learn, such that our past and future brains gradually diverge.
The only thing that really connects our past, present, and future selves is causality, in the sense that our past selves lead to our future selves when you follow the arrow of time. Therefore, what you -think- is a big deal really isn’t.
This doesn’t mean you shouldn’t care about your future selves. In fact, in the same way that you should care about the experiences of all sentient beings because those experiences are real, you should care about the experiences of your future selves.
But don’t worry so much about things that you cannot control, like whether, because of Dust Theory, you’ll wake up tomorrow. I cannot see how worrying about this possibility will make it any more or less likely to occur. For all we know, the sun could explode tomorrow. There is a non-zero possibility of that happening, because we don’t know everything. But the probability of it happening, given our past experience with the sun, is very, very low, and as such behaving as if it will happen is completely irrational.

Act according to what is MOST likely to happen, and what is MOST likely true, given the information you have right now. Maximize the Expected Utility. Expected is the key word here. Don’t make plans based on mere hopes or fears unless they are also expected. In statistics, expectation is commonly associated with the mean or average. Realistically, what happens tomorrow will probably be a very average day.
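(To make “expected” concrete: the expected utility of an action is the probability-weighted average of the utilities of its possible outcomes,

\[ \mathbb{E}[U(a)] = \sum_i P(o_i \mid a)\, U(o_i). \]

The numbers here are purely illustrative, but note how the weighting works: a dreaded outcome with utility −1,000,000 and probability 10⁻⁹ contributes only −0.001 to the sum, which is why tiny-probability catastrophes shouldn’t dominate your plans.)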
That is being rational.

Hope that helps!

For instance, with Dust Theory, you say that you gave it at most a 10% chance of being true, and it was paralyzing to you. This shouldn’t be. First, you need to consider your priors and the evidence.
No, I rated the death outcome as having a 10% chance of being true. But now I rate it much lower.
How often in the past have you had the actual experience that Dust Theory suggests is possible and which you fear? What actual experiential evidence do you have to suggest that Dust Theory is true?
This:
It will be some kind of natural selection in dust world-lines, which will result in more stable ones, and most likely I am already in such a line. In this line, dreaming is built such that it will not result in important shifts of reality. And it is true: dreaming is not an unconscious state. I start to have dreams immediately when I fall asleep. So dreaming is built so as not to interrupt some level of consciousness.
Basically, the fact that we shift only a little bit accounts for our observations in ways that other cosmological theories can’t.
For that matter, one of the common threads of your fears seems to be that “you” cease to exist and are replaced by a different “you”, or that “you” die. But the truth is that people are already constantly changing. The “you” from 10 years ago was made up of different atoms than the “you” 10 years from now will be, by virtue of the fact that our cells are constantly dying and being replaced. The thoughts we have also change from moment to moment, and our brains adjust the strengths of the connections between neurons in order to learn, such that our past and future brains gradually diverge.
Er, you don’t understand the problem. I was worried about my subjective self dying.
I suspect part of the issue here is that your concept of subjective self isn’t constructed to be compatible with these kinds of thought experiments, or with the idea that reality may be forking and terminating all the time. I can say that because mine -is- compatible with such things, and as a result pretty much all of this category of problem doesn’t even show up on my radar.
Assuming I had a magical copying device that could copy my body with sufficient accuracy, I could:
use the copier to create a copy of myself, and as the copy do the household chores, then self-destruct to free up resources, without worrying about my ‘self’ dying.
use the copier to create a copy of myself, then as the original go do the chores/self-destruct, without worrying about my ‘self’ dying.
if there were a resource conflict that required the destruction of a copy, decide that I was the ‘least important’ copy and self-terminate, without worrying about my ‘self’ dying.
When a person’s sense of identity can do the above things, concerns about your dust scenario really don’t even show up as relevant: it doesn’t matter which timeline or state you end up in; so long as your self is active somewhere, you’re good.

How would you treat the above situations?
use the copier to create a copy of myself, and as the copy do the household chores, then self-destruct to free up resources, without worrying about my ‘self’ dying.
I wouldn’t do it in the first place, since there’s a fifty percent chance of me winding up doomed. But if the copy is already created, then no, it would not be me dying.
use the copier to create a copy of myself, then as the original go do the chores/self-destruct, without worrying about my ‘self’ dying.
That is absolutely dying.
if there were a resource conflict that required the destruction of a copy, decide that I was the ‘least important’ copy and self-terminate, without worrying about my ‘self’ dying.
Same thing for this.

That’s what I figured. If anything, I’d say that this is your core issue, not Dust Theory. Your sense of subjective self just doesn’t map well onto what it’s actually possible to do, so of course you’re going to get garbage results from time to time.
I guess I don’t understand then? Care to explain what your “subjective self” actually is?