you don’t just care about one particular possible world where things happen to turn out exactly the way you want.
Presumably there are infinitely many possible worlds where things happen to turn out exactly the way I want: I care about some small finite subset of the world, and the rest is allowed to vary. Why should I expend energy worrying about one particular infinity of worlds that are hard to optimize when I have already got infinitely many where I win easily or by default?
There are presumably also infinitely many possible worlds where all varieties of bizarre decision/action algorithms are the way to win. For example, the world where the extent to which your preferences get satisfied is determined by what fraction of your skin is covered in red body paint, etc, etc.
Also, there are other classes of worlds where I lose: for example, anti-inductive worlds. Why should I pay special attention to the worlds that loosely obey the Occam/complexity prior?
Perhaps I could frame it this way: the complexity prior is (in fact) counterintuitive and alien to the human mind. Why should I pay special attention to worlds that conform to it (simple worlds)?
The answer I used to have was “because it works”, which seemed to cash out as
“if I use a complexity prior to repeatedly make decisions, then my subjective experience will be (mostly) of winning”
which I used to think was because the Real world that we live in is, in fact, a simple one, rather than a wishful-thinking one, a red-body-paint one, or an anti-inductive one.
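(For concreteness, the “complexity prior” under discussion is usually formalized Solomonoff-style: each world w gets weight proportional to the shortness of its description,

Pr(w) ∝ 2^(−K(w)),

where K(w) is the length in bits of the shortest program that describes w, and acting on the prior means choosing a* = argmax_a Σ_w 2^(−K(w)) · U(a, w). This formalization is standard, though neither commenter spells it out.)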
It sounds like you’re assuming that people use a wishful-thinking prior by default, and have to be argued into a complexity-based prior. This seems implausible to me.
I think the phenomenon of wishful thinking doesn’t come from one’s prior, but from evolution being too stupid to design a rational decision process. That is, a part of my brain rewards me for increasing the anticipation of positive future experiences, even if that increase is caused by faulty reasoning instead of good decisions. This causes me to engage in wishful thinking (i.e., miscalculating the implications of my prior) in order to increase my reward.
Perhaps I could frame it this way: the complexity prior is (in fact) counterintuitive and alien to the human mind.
I dispute this. Sure, some of the implications of the complexity prior are counterintuitive, but it would be surprising if none of them were. I mean, some theorems of number theory are counterintuitive, but that doesn’t mean integers are alien to the human mind.
Why should I pay special attention to worlds that conform to it (simple worlds)?
Suppose someone gave you a water-tight argument that all possible worlds are in fact real, and you have to make decisions based on which worlds you care more about. Would you really adopt the “wishful-thinking” prior and start putting all your money into lottery tickets or something similar, or would your behavior be more or less unaffected? If it’s the latter, don’t you already care more about worlds that are simple?
“if I use a complexity prior to repeatedly make decisions, then my subjective experience will be (mostly) of winning”
Perhaps this is just one of the ways an algorithm that cares about each world in proportion to its inverse complexity could feel from the inside?
this is a good point, I’ll have to think about it.
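(To make the “feel from the inside” point concrete, here is a minimal sketch, in Python with invented numbers; neither commenter wrote this. Two agents face the lottery-ticket decision from above. The decision machinery is identical; only the weighting over worlds differs. An agent that weights worlds by inverse description length behaves exactly like one that believes a complexity prior, whether you call the weights “probability” or “caring”.)

```python
# Toy illustration (all numbers invented): the same expected-payoff machinery,
# run with two different ways of weighting possible worlds.

# Each world: bits = description length, appeal = how much the agent wishes
# it were real, ticket_wins = whether a $1 lottery ticket pays $1000 there.
WORLDS = [
    {"bits": 10, "appeal": 0.1, "ticket_wins": False},  # simple, boring world
    {"bits": 40, "appeal": 1.0, "ticket_wins": True},   # contrived world rigged so the ticket wins
]

def utility(world, buy):
    """Payoff of buying (or not buying) the ticket in a given world."""
    if not buy:
        return 0.0
    return 999.0 if world["ticket_wins"] else -1.0

def decide(worlds, weight):
    """Choose the action whose weight-averaged payoff is highest."""
    def score(buy):
        return sum(weight(w) * utility(w, buy) for w in worlds)
    return max([False, True], key=score)

complexity_weight = lambda w: 2.0 ** -w["bits"]  # cares about worlds in proportion to simplicity
wishful_weight = lambda w: w["appeal"]           # cares about worlds in proportion to appeal

print(decide(WORLDS, complexity_weight))  # False: doesn't buy the ticket
print(decide(WORLDS, wishful_weight))     # True: buys the ticket
```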
Suppose someone gave you a water-tight argument that all possible worlds are in fact real, and you have to make decisions based on which worlds you care more about. Would you really adopt the “wishful-thinking” prior and start putting all your money into lottery tickets or something similar, or would your behavior be more or less unaffected?
I think that there would be a question about what “I” would actually experience.
There have been times in my younger days when I tried a bit of wishful thinking—I think everyone has. Maybe, just maybe, if I wish hard enough for X, X will happen? Well, what you actually experience after doing that is … failure. Wishing for something doesn’t make it happen—or if it does in some worlds, then I have evidence that I don’t inhabit those worlds.
So I suppose I am using my memory—which points to me having always been in a world that behaves exactly as the complexity prior would predict—as evidence that the thread of my subjective experience will always be in a world that behaves as the complexity prior would predict, which is sort of like saying that only one particular simple world is real.
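(One way to gloss that update, in notation neither commenter uses: if a wish-granting world predicts that hard wishing succeeds with probability p ≈ 1, then n observed failures multiply its posterior weight by (1 − p)^n, which collapses toward zero within a handful of attempts, while simple worlds that predicted failure all along lose nothing.)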
You don’t believe in affirmations? The self-help books about the power of positive thinking don’t work for you? What do you make of the following quote?
“Personal optimism correlates strongly with self-esteem, with psychological well-being and with physical and mental health. Optimism has been shown to be correlated with better immune systems in healthy people who have been subjected to stress.”
http://en.wikipedia.org/wiki/Optimism
This is not the kind of wishful thinking I was talking about: I was talking about wishing for $1000 and it just appearing in your bank account.
When crafting one’s wishes, one should have at least some minor element of realism.
Also, your wish should be something your subconscious can help you with. For example, instead of wishfully thinking about money appearing in your bank account, you could wishfully think about finding it on the sidewalk. Or, alternatively, you could wishfully think of yourself as a money magnet.
If you previously did not bear such points in mind, you might want to consider revisiting the technique, to see if you can make something of it. Unless you figure you are already too optimistic, that is.