That doesn't mesh with the experiments Harry and Hermione performed in chapter 22, or at least not without a complexity penalty large enough to make alternative explanations more plausible.
Meetup : Brussels—The Art of Not Being Right
Meetup : Brussels: March meetup (1PM) + Harry Potter MoR Party (6PM)
Harry can control the order of a transfiguration process, as seen in ch. 104. Those are not threads floating freely in the air; they're part of a specific wire shape in the process of being transfigured. We also know that you can transfigure against tension.
Meetup : Brussels February meetup: Words
Meetup : Brussels—Mindfulness and mental habits
Meetup : Brussels—Hope & Self-improvement
I took it as a reminder of what was discussed in How to Actually Change Your Mind: confirmation bias, affective death spirals, etc.
Meetup : Brussels November meetup: Hell and existential risks
Seconded. On Android I'm using FBReader with an Ivona voice (free, with the drawback that I have to re-download Ivona every couple of months). It works really well for non-fiction, even the Sequences with all their long made-up words.
It doesn’t work so well with fantasy/sci-fi though. Made-up words without an English root trip it up.
The work-in-progress Worm audiobook might be of use then.
Starting from chapter 10, the protagonist dedicates herself to a single goal and never wavers from it, no matter what it costs her across countless lifetimes. She cheats with many-worlds magic, but it's a kind of magic that still requires as much hard work as the real thing.
I smiled when I realized why the answer isn’t trivially “press sim”, but that slight obfuscation is causing a lot of confused people to get downvoted.
If you decide not to press “sim”, you know that there are no simulations. It’s impossible for there to be an original who presses “sim” only for the simulations to make different decisions. You’re the original and will leave with 0.9.
If you decide to press “sim”, you know that there are 1000 simulations. You’ve only got a 1 in 1001 chance of being the original. Your expected utility for pressing the button is slightly more than 0.2.
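To spell out the arithmetic (a rough sketch: I'm writing u for whatever the original gets by pressing, since I don't recall the exact figure from the post, and taking 0.2 as the simulations' payoff, which is how I read the "slightly more than 0.2" above):

E[press] = (1/1001) · u + (1000/1001) · 0.2 ≈ 0.2 + (u − 0.2)/1001

So pressing stays only slightly above 0.2 for any u of ordinary size, well below the certain 0.9 from not pressing.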
Working on my first serious project using AndEngine (a game that's a cross between Recettear and Night Shift). The joy of puzzling code out without any documentation. I'm at the stage where I can display the shop and have customers come in and wobble around, though there's no actual gameplay yet.
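For anyone else wrestling with AndEngine's lack of documentation, here is roughly what the "display a sprite and make it wobble" stage looks like. This is a minimal sketch against the GLES2 branch; the class name ShopActivity, the asset gfx/customer.png, and the 800×480 camera are placeholders of mine, not the actual project's code.

```java
import org.andengine.engine.camera.Camera;
import org.andengine.engine.options.EngineOptions;
import org.andengine.engine.options.ScreenOrientation;
import org.andengine.engine.options.resolutionpolicy.RatioResolutionPolicy;
import org.andengine.entity.modifier.LoopEntityModifier;
import org.andengine.entity.modifier.MoveXModifier;
import org.andengine.entity.modifier.SequenceEntityModifier;
import org.andengine.entity.scene.Scene;
import org.andengine.entity.scene.background.Background;
import org.andengine.entity.sprite.Sprite;
import org.andengine.opengl.texture.TextureOptions;
import org.andengine.opengl.texture.atlas.bitmap.BitmapTextureAtlas;
import org.andengine.opengl.texture.atlas.bitmap.BitmapTextureAtlasTextureRegionFactory;
import org.andengine.opengl.texture.region.ITextureRegion;
import org.andengine.ui.activity.SimpleBaseGameActivity;

public class ShopActivity extends SimpleBaseGameActivity {
    private ITextureRegion mCustomerRegion;

    @Override
    public EngineOptions onCreateEngineOptions() {
        // Fixed-size camera; 800x480 is an arbitrary placeholder resolution.
        final Camera camera = new Camera(0, 0, 800, 480);
        return new EngineOptions(true, ScreenOrientation.LANDSCAPE_FIXED,
                new RatioResolutionPolicy(800, 480), camera);
    }

    @Override
    public void onCreateResources() {
        // Load one customer sprite from assets/gfx/customer.png (placeholder asset).
        BitmapTextureAtlasTextureRegionFactory.setAssetBasePath("gfx/");
        final BitmapTextureAtlas atlas =
                new BitmapTextureAtlas(getTextureManager(), 128, 128, TextureOptions.BILINEAR);
        mCustomerRegion = BitmapTextureAtlasTextureRegionFactory
                .createFromAsset(atlas, this, "customer.png", 0, 0);
        atlas.load();
    }

    @Override
    public Scene onCreateScene() {
        final Scene scene = new Scene();
        scene.setBackground(new Background(0.1f, 0.1f, 0.1f));

        // One customer that "wobbles" by looping a small left-right movement forever.
        final Sprite customer = new Sprite(400, 240, mCustomerRegion, getVertexBufferObjectManager());
        customer.registerEntityModifier(new LoopEntityModifier(
                new SequenceEntityModifier(
                        new MoveXModifier(0.5f, 400, 420),
                        new MoveXModifier(0.5f, 420, 400))));
        scene.attachChild(customer);

        return scene;
    }
}
```

Chaining LoopEntityModifier around a SequenceEntityModifier seems to be the least-painful way to get a cheap idle animation without writing your own update handler.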
Meetup : Brussels—September meetup
I don't think it's a logical fallacy at all. I mean, anyone who changes their mind about cryonics because of the promise of future Margaret Atwood novels is probably not being very rational, but formally there's nothing wrong with that reasoning.
I’m an Atwood-reading robot. I exist only to read every Margaret Atwood novel. I expect to outlive her, so the future holds nothing of value to me. No need for cryonics. Oh but what’s this? A secret Atwood novel to be released in 2114? Sign me up! I’ll go back to suicidal apathy after I’ve read the 2114 novel.
You’d keep it in your hand and use it as an improvised hammer to carefully break yourself a big enough hole. Hopefully without collapsing the whole house.
If you’re trapped in a glass house and you have a stone, throwing it is still a terrible idea.
I managed to get it to output this prompt. It’s possible it’s hallucinating some or all of it, but the date at least was correct.