I’ve got to start listening to those quiet, nagging doubts.
MinibearRex
BTW, the post says that spoilers from the original canon don’t need to be in rot13.
Their hearts stop beating, and they stop needing to breathe during the turning process.
I plan to keep doing reruns through “Final Words”, which will be posted two days from now. After that, I have no particular plans to keep going. I intend to create a post prompting discussion of future plans, but I don’t plan to personally do another rerun.
To try to be happy is to try to build a machine with no other specification than that it shall run noiselessly. -Robert Oppenheimer, 1929
I don’t think EY actually suggests that people are doing those calculations. He’s saying that we’re just executing an adaptation that functioned well in groups of a hundred or so, but doesn’t work nearly as well anymore.
The trouble is that there is nothing in epistemic rationality that corresponds to “motivations” or “goals” or anything like that. Epistemic rationality can tell you that pushing a button will lead to puppies not being tortured, and not pushing it will lead to puppies being tortured, but unless you have an additional system that incorporates desires for puppies to not be tortured, as well as a system for achieving those desires, that’s all you can do with epistemic rationality.
I think you’re confusing Pascal’s Wager with Pascal’s Mugging. The problem with Pascal’s Mugging is that the payoffs are really high. The problem with Pascal’s Wager is that it fails to consider any hypotheses other than “there is the Christian god” and “there is no god”.
I’m not really sure that counts as faith. Faith usually implies something like “believing something without concern for evidence”. And in fact, the evidence I have fairly strongly indicates that when I step into an airplane, I’m not going to die.
Including that one?
Hanson’s reply may be worth reading.
Probably because very few people propose playing solitaire and Settlers of Catan forever as their version of a Utopia. Spending eternity playing games on the holodeck, however, is frequently mentioned.
By the way, there may be some interruptions to posting sequence reruns over the course of the next week. Unfortunately, I’m going to be traveling and working on an odd schedule that may not let me reliably spend some time daily posting these things. I’ll try to get to it as much as possible, but I apologize in advance if I miss a few days.
I tend to use the word fun.
We finish with high confidence in the script’s authenticity.
If you’re already familiar with this particular leaked 2009 live-action script, please write down your current best guess as to how likely it is to be authentic.
Unless someone already tried to come up with an explicit probability, this ordering will bias the results. Ask people for their guesses before you tell them what you have already written on the subject.
Your competition story qualifies you for an upvote, for munchkinry.
It’s a pretty good idea for a sentence, too.
I will note that this seems as though it ought to be a problem that we can gather data on. We don’t have to theorize if we can find a good sample of cases in which a minister said they would resign, and then look at when they actually resigned.
Additionally, this post is mostly about a particular question involving anticipating political change, but the post title sounds like a more abstract issue in probability theory (how we should react if we learn that we will believe something at some later point).
And with this post, we have reached the last post in the 2008 Hanson-Yudkowsky AI Foom Debate. Starting tomorrow, we return to the regularly scheduled sequence reruns, and start moving into the Fun Theory Sequence.
I would advise putting a little bit more effort into formatting. Some of the font jumps are somewhat jarring, and prevent your post from having as much of an impact as you might hope.
Harry Potter and the Confirmed Critical, Chapter 6