...too often the poster seems to be saying “look, absurd result, but the numbers work out so this is important!” rather than “oh, I hit an absurdity, perhaps I’m stretching this way further than it goes.”
Yes, I don’t understand this at all. For example, even Yudkowsky writes that he would sooner question his grasp of “rationality” than give five dollars to a Pascal’s Mugger because he thought it was “rational”. Now as far as I can tell, they still use this framework to make decisions, a framework that implies absurd decisions, rather than concentrating on examining the framework itself, and looking for better alternatives.
What I am having problems with is that they seem to teach people to “shut up and multiply”, and approximate EU maximization, yet arbitrarily ignore low probabilities. I say “arbitrarily” because nobody ever told me at what point it is rational to step out of this framework and ignore a calculation.
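The worry can be made concrete with a toy calculation (all numbers are mine, invented for illustration): under naive expected-utility maximization, a conjectured utility can always be made large enough to swamp any honest, tiny credence.

```python
# Toy illustration of naive expected-utility maximization and the
# Pascal's Mugger problem. All numbers are made up for illustration.

def expected_utility(probability, utility):
    """Naive EU: probability times utility, with no discounting."""
    return probability * utility

# The mugger claims to save 10**100 lives if you hand over $5.
# Even an absurdly small credence in the claim...
p_mugger_honest = 1e-30
u_lives_saved = 1e100        # conjectured utility if the claim is true
u_keep_five_dollars = 5.0    # utility of just keeping your money

eu_pay = expected_utility(p_mugger_honest, u_lives_saved)
eu_refuse = u_keep_five_dollars

# ...still makes paying dominate, because the conjectured utility
# can grow faster than any honest credence shrinks.
assert eu_pay > eu_refuse
print(f"EU(pay) = {eu_pay:.3g}, EU(refuse) = {eu_refuse:.3g}")
```

The point of the sketch is that nothing inside the framework itself says where to cut off the low probabilities; that decision has to come from outside the calculation.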
You could argue that our current grasp of rationality is less wrong. But why then worry about something like Dutch booking when any stranger can make you give them all your money simply by conjecturing vast utilities if you don’t? That seems more wrong to me.
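For readers unfamiliar with the term: Dutch booking is the classic coherence argument that an agent with incoherent credences will accept a set of bets guaranteeing a loss. A toy sketch (the propositions, prices, and credences are invented):

```python
# Toy Dutch book: an agent whose credences for "rain" and "no rain"
# sum to more than 1 will buy a pair of bets that lose money in
# every outcome. All numbers are illustrative.

credence_rain = 0.6
credence_no_rain = 0.6   # incoherent: should be 1 - 0.6 = 0.4

stake = 1.0  # each bet pays out `stake` if its proposition is true

# A fair-seeming price for a bet paying `stake` is credence * stake,
# so this agent happily buys both bets at these prices.
price_rain = credence_rain * stake
price_no_rain = credence_no_rain * stake

for it_rains in (True, False):
    payout = stake  # exactly one of the two bets pays out
    net = payout - (price_rain + price_no_rain)
    print(f"rain={it_rains}: net = {net:+.2f}")
    assert net < 0  # guaranteed loss in both outcomes
```

The complaint above is that avoiding this guaranteed 20-cent loss seems like a minor virtue next to a framework that hands over everything to a sufficiently imaginative mugger.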
Lots of frameworks imply different absurd decisions (especially when viewed from other frameworks) but it’s hard to go about your life without using some sort of framework.
If rationality is on average less wrong but you think your intuition is better in a certain scenario, a mixed strategy makes sense.
No, it means your intuition is better than your rationality, and you should fix that. If your rational model is not as good as your intuition at making decisions, then it is flawed and you need to move on.
Let’s say I have 300 situations where I recorded my decision-making process. I tried to use rationality to make the right decision in all of them, and kept track of whether I regretted the outcome. In 100 of these situations, my intuitions disagreed with my rational model, and I followed my rational model. If I only regret the outcome in 1 of these 100 situations, in what way does it make sense to throw out my model? You can RATIONALLY decide that certain situations are not amenable to your rational framework without deciding the framework is without value.
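The bookkeeping described above can be sketched directly (the 300/100/1 counts come from the comment; the record format is invented):

```python
# Sketch of the regret bookkeeping described above. The 300/100/1
# counts come from the comment; the record format is invented.
# Each record: (model_and_intuition_disagreed, followed_model, regretted)
records = (
    [(False, True, False)] * 200   # model and intuition agreed
    + [(True, True, False)] * 99   # disagreed, followed model, no regret
    + [(True, True, True)] * 1     # disagreed, followed model, regretted
)

disagreements = [r for r in records if r[0]]
regrets = sum(1 for r in disagreements if r[1] and r[2])

regret_rate = regrets / len(disagreements)
print(f"Followed model in {len(disagreements)} disagreements; "
      f"regret rate {regret_rate:.0%}")
# A 1% regret rate argues for patching the model in the odd cases,
# not for discarding it wholesale.
```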
Let’s say we do 100 physics experiments, and 99% of the results agree with our model. Do we get to ignore / throw out that one “erroneous” result? No, that result if verified shows a flaw in our model.
If afterwards you regretted a choice and wish you had made a better one even with the information available to you at the time, then this realization should have you bolt upright in your chair. If verified, your decision-making process needs updating.
it’s still a pretty damn good model. Why can’t you get that point? Newtonian mechanics was still a very useful model and would’ve been ridiculous to replace with intuition just because it gave absurd answers in relativistic situations.
I never contradicted that point. Newtonian physics works quite fine in many situations. It is still wrong.
Edit: to expand on that point, when we use physics we know that there are certain circumstances in which we use classical physics because it is easier and faster and the results are good enough for the precision we need. Other times we use quantum physics or relativity. The decision of which model to use is itself part of the decision-making framework, and is what I’m talking about. If you choose the wrong framework and get incorrect results, then your metamodel of which framework to use needs to be updated.
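That "metamodel" idea can be sketched as an explicit dispatcher (the threshold value and function names are mine, not from the comment):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def momentum_classical(mass, velocity):
    """Newtonian momentum: cheap and accurate when v << c."""
    return mass * velocity

def momentum_relativistic(mass, velocity):
    """Relativistic momentum: correct at any speed below c."""
    gamma = 1.0 / math.sqrt(1.0 - (velocity / C) ** 2)
    return gamma * mass * velocity

def momentum(mass, velocity, threshold=0.01):
    """Metamodel: pick the framework based on how relativistic we are.

    The threshold (the v/c above which classical answers stop being
    good enough) is itself a modelling choice; if it produces bad
    results, it is the threshold that needs updating, not necessarily
    either underlying model.
    """
    if abs(velocity) / C < threshold:
        return momentum_classical(mass, velocity)
    return momentum_relativistic(mass, velocity)

slow = momentum(1.0, 300.0)      # ~1e-6 c: classical branch
fast = momentum(1.0, 0.6 * C)    # 0.6 c: relativistic branch
assert slow == 300.0
assert abs(fast - 0.75 * C) < 1e-3
```

Regretting a result here would mean the dispatch rule sent a relativistic situation to the classical model, and the fix is to the dispatch rule, exactly as the comment argues for decision frameworks.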
You seem to have completely missed my point.