“All men are created equal” is false insofar as atom configurations go: every human is unique (by Quantum Indistinguishability, any non-unique humans are the same human). On the other hand, it is probably true insofar as the CEV Morality Computation is concerned.
“The lottery is a waste of hope” is true by an expected capital gain calculation (it is a net loss), and by utilitarian lights putting mental energy into a net loss is worse than putting it into a net gain (working hard and hoping you get rich), because at the very least having money allows you to donate more to charity.
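The expected-loss point is a one-line calculation. As a sketch with illustrative numbers (the jackpot, odds, and ticket price below are assumptions, roughly Powerball-like, not figures from the discussion):

```python
# Expected value of a lottery ticket, with hypothetical illustrative numbers.
jackpot = 100_000_000      # prize in dollars (assumed)
p_win = 1 / 300_000_000    # probability of winning (assumed)
ticket_price = 2           # cost per ticket (assumed)

expected_value = p_win * jackpot - ticket_price
print(f"Expected net gain per ticket: ${expected_value:.2f}")  # about -$1.67
```

The expected value stays sharply negative for any realistic jackpot, which is what makes the ticket a net loss in expectation.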
“Religious people are intolerant” depends largely on the religion and on how much of its scripture is “boo for unbelievers,” but religion seems to almost always incite ingroup-outgroup dichotomies in the psychology of believers, and that, I think, is pretty much what intolerance springs from.
“Government is not the solution; government is the problem” is false, because humans generally need more than ‘fairness’ to avoid defection in the iterated prisoner’s dilemma. Many don’t realise that bureaucracy is fair, if often slow and bloated. A real chance of punishment, decided by fair trial, is very effective at deterring defectors.
“George Washington was a better president than James Buchanan” depends solely on the criterion. People’s opinions? Historical attitudes? GDP growth rates? Percentage of votes won at election? Your own boo/yay rating? Mix and match as you like.
“The economy is doing worse today than it was ten years ago” depends on whether you look at GDP growth or GDP itself: the global economic crisis means lower growth, but we are still richer than we were ten years ago.
“God exists,” very unlikely, courtesy of Solomonoff Induction.
“One impulse from a vernal wood can teach you more of man, of moral evil, and of good than all the sages can,” bluh? Is this some sort of pop-culture reference? It depends on what a vernal wood is: if it is something akin to the Akashic record, granting omniscience, then it is true. If it is anything else, probably not.
“Imagination is more important than knowledge” is false: the more knowledge you have, the more your lawful creativity can search, the better your imagination, and the better you can use what you have gained with the eleventh virtue. Both are roughly equally important.
“Rationalists should win” is a mathematical tautology: perfectly rational Bayesian expected utility maximizers do just that. For humans, it is a good heuristic for avoiding privileged rituals of thought.
There can be value in tautology for the purpose of drawing attention to an important point: “oh, I’m not winning, I am not a rationalist, then.”
Exactly. If you are not currently holding one million dollars from one-boxing, then you are being irrational (assuming monotonically increasing utility in money), and should self-modify accordingly.
“Rationalists should win” is a mathematical tautology: perfectly rational Bayesian expected utility maximizers do just that.
Just to clarify: the way you use the phrase, would you say a perfectly rational Bayesian expected utility maximizer takes one box or two in Newcomb’s problem? Plenty of people would claim that that particular combination of terms refers to a particular kind of agent (and a particular meaning of ‘rational’) which two-boxes. The phrase “rationalists should win” comes with the unambiguous “one box” prescription built in. Those people would therefore either say that the phrase “rationalists should win” is tautologically false, or else insist on different language.
The BayRatUtilMax agent I am talking about is of course running the One True Decision Theory, which one-boxes, is immune to acausal blackmail, and has all sorts of other nice features.
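For concreteness, the one-box/two-box disagreement above can be framed as a simple expected-value comparison. This is a sketch under the standard payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one); the 99% predictor accuracy is an assumption for illustration:

```python
# Expected payoffs in Newcomb's problem, given predictor accuracy p (assumed 0.99).
p = 0.99  # probability the predictor correctly predicts your choice (assumption)

# One-boxing: the opaque box contains $1,000,000 iff the predictor foresaw one-boxing.
ev_one_box = p * 1_000_000

# Two-boxing: you always get the transparent box's $1,000; the opaque box
# holds $1,000,000 only if the predictor (wrongly) foresaw one-boxing.
ev_two_box = 1_000 + (1 - p) * 1_000_000

print(ev_one_box, ev_two_box)
```

Under this framing, one-boxing has the higher expected payoff for any accuracy above roughly 50.05%; the two-boxer's dominance argument rejects the framing rather than the arithmetic.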
“One impulse from a vernal wood can teach you more of man, of moral evil, and of good than all the sages can,” bluh? Is this some sort of pop-culture reference? It depends on what a vernal wood is: if it is something akin to the Akashic record, granting omniscience, then it is true. If it is anything else, probably not.
It’s a Wordsworth quote. Also, “vernal wood” is like “vernal pool”: vernal means ‘of or pertaining to spring’. The claim is reducible, and false.
“All men are created equal” is false insofar as atom configurations go: every human is unique (by Quantum Indistinguishability, any non-unique humans are the same human).
Uniqueness does not imply inequality without the additional assumption of a measure of ‘value’, which it seems the other speaker doesn’t share.
Your rituals of thought should pay rent too. If you follow the way of ‘rationality’ and bad things happen, you should find out what you are doing wrong and act accordingly. There is no such excuse as ‘but I did everything I was supposed to’.
There is no such excuse as ‘but I did everything I was supposed to’.
Huh?
Imagine a lottery with a $500 prize, 100 tickets sold for a dollar each. The rational thing to do is buy every ticket you can. But you get to the sales office too late, and one ticket has already been sold. You buy the remainder, but don’t win the lottery. You ended up losing money, but you did everything right, didn’t you?
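The arithmetic of this example checks out: buying the 99 remaining tickets costs $99 and wins the $500 prize with probability 99/100, so the purchase is positive expected value even though it loses outright 1% of the time.

```python
# Expected value of buying the 99 remaining tickets in the $500 / 100-ticket lottery.
prize = 500
tickets_bought = 99
ticket_price = 1

p_win = tickets_bought / 100          # 0.99 chance of holding the winning ticket
cost = tickets_bought * ticket_price  # $99 spent

ev = p_win * prize - cost             # 0.99 * 500 - 99 = $396 expected net gain
print(ev)
```

So the decision was right in expectation; the 1% branch where you lose $99 is exactly the sort of defeat that carries no lesson.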
Well, rationalists should end up “winning” insofar as winning means “doing better than non-rationalists ON AVERAGE”.
Then again, it doesn’t mean all rationalists end up living to 120 and extremely rich. If you are a non-rationalist born with a billion dollars in your bank account, you’ll probably end up richer than a rationalist born to a poor family in North Korea with no legs and no arms.
But on the other hand, if you cannot identify the causes of your defeats as completely independent of yourself, it probably means you are doing something wrong, or at least not optimally.
In the lottery example above, there are 99 other worlds where the rationalist who bought the tickets is better off than the man who did not (unless the lottery is rigged, in which case the rationalist is the one who realised that something smells funny and doesn’t buy tickets). Or, more intuitively: if there are a lot of such lotteries, the rationalist who buys the tickets every time will end up richer than the man who doesn’t.
IN YOUR LIFE, there are probably enough such “lotteries” for you to end up better off if you are a rationalist than if you are not, and reliably so.
(and “you did everything right” but maybe the right thing to do would have been to arrive at the sales office earlier).
On the other hand, you should expect such a thing to happen only 1% of the time on average, so if you’re consistently unlucky for a long period of time, odds are you’re doing something wrong.
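As an intuition check on the “consistently unlucky” point: if each such lottery only fails 1% of the time, a run of failures becomes implausibly rare very fast (a sketch, assuming independent trials with the 1% loss chance from the example above):

```python
# Probability of losing k independent lotteries in a row,
# each with a 1% chance of loss.
p_loss = 0.01
for k in (1, 2, 3, 5):
    print(k, p_loss ** k)  # three losses in a row is already about one in a million
```

So a long unlucky streak is far stronger evidence of a mistake in your model than of bad luck.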
It is not in principle possible to do better in that scenario. It is in principle possible to do better than, say, two-boxing on Newcomb’s problem, even though a CDT agent always does that.
If I randomly get hit by a meteor, there isn’t a lot I could have done to avoid it. If I willingly drive faster than the speed limit and get myself killed in an accident, there is little excuse for not having abided by the speed limit and survived.
Uniqueness does not imply inequality without the additional assumption of a measure of ‘value’, which it seems the other speaker doesn’t share.
‘Equal’ meaning ‘copy’ is false; ‘equal’ meaning ‘value in Ethical Utilons’ is true.
Privileged Rituals of Thought?
“If you are so smart, why aren’t you rich?”