Sorry, I wasn’t clear. I meant “if I was in roughly Harry’s situation in HP:MoR” (and “I would read about common plot twists that seemed pertinent to the sort of story that I found myself in”, not “I would already know which story I was in, recursively” though that would be pretty cool). I almost agree with your analysis except for decision theoretic reasons I never expect to find myself as a muggle in any story. (“My” “self” grumble dissatisfaction grumble.)
I hate to be the one to break this to you, Will, but… you can’t do magic. I’m sorry.
I feel like I should make a bet, but it’s a poor habit to make bets on the tails of distributions. (Meaning I suspect that I’m slightly less sure I can’t do magick than others would be but I’m still pretty damn sure I can’t, at least not in a way that others would say could be legitimately described as magick.)
Based on http://predictionbook.com/predictions/3377, I’d think you’d have to assign at least a 5% chance that you can, if I’m reading this correctly, since this is only one possible method of using magic. Is 5% close enough to the tail that you don’t think bets should be made over it?
No, that’s a different reading of “can”; I guess by “can” I meant “I currently use magick but am unaware that I am doing so”; if we were talking about the potential to learn magick, then I’d have to put it at around 5%. Me unknowingly doing magick is more like a... you know, actually, I’d rather not talk about that. Still probably a lot higher than others would guess, but orders of magnitude lower than 5%.
Are you willing/able to discuss the causes of your unusually high belief in magic?
I meant “if I was in roughly Harry’s situation in HP:MoR”
Can you expand on what part of his situation would do that? What is the scenario in question? Someone shows up at your door tomorrow and tells you that you are a wizard? I’m still not clear on which situation you have in mind.
for decision theoretic reasons I never expect to find myself as a muggle in any story.
What do you mean? I can understand the argument that random muggles aren’t likely to be simulated to full fidelity and so entities that have enough processing power to act as observers shouldn’t expect themselves to be random bystanders in a story about people they never meet. But this has nothing to do with decision theory so you seem to be driving at some other point.
It’s really hard for me to answer your first question. Basically everything about HP:MoR has been optimized to be a good story, so I’m tempted to answer “everything”, but I realize that isn’t helpful. But for some reason I find listing things aversive. Um. Um?
The simulation fidelity thing is actually, I think, equivalent to the decision theory thing; or at least, fidelity of simulation is directly correlated with decision theoretic significance. I don’t anticipate being understood and thus can’t muster up the energy to try to be understood, recursively. I’m sorry. But the simulation thing is basically close enough, yeah.
It’s really hard for me to answer your first question. Basically everything about HP:MoR has been optimized to be a good story, so I’m tempted to answer “everything”, but I realize that isn’t helpful.
I’m not sure someone in a good story would recognize that they are in a story even when it is highly optimized. From the reader’s perspective Harry might be interesting but even from his perspective he’s spent days in classes, he’s spent hours listening to Professor Binns drone on, he’s had to do tedious homework, and he’s had 11 years where he was just like a lot of other very smart kids, many of whom beat him in math contests.
Right, if you start from decision theory then the prior is high and if you start from naive realism then the prior is really low, but I mean, the likelihood ratio started out high the very moment he realized he was abnormally intelligent and he had three last names, and ever since then it just keeps getting bigger, and bigger, and bigger, and bigger, and bigger, and bigger, and...
He lost in math contests, but I think he thought himself smarter than almost all other humans along the dimensions that actually mattered. He explicitly has a messianic complex.
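For what it’s worth, the “prior” and “likelihood ratio” talk here can be cashed out in odds form; the magnitudes below are purely illustrative placeholders of mine, not numbers anyone in the thread has estimated:

\[
\text{posterior odds} \;=\; \text{prior odds} \times \prod_i \Lambda_i,
\qquad \text{e.g. } 10^{-9} \times 10^{2} \times 10^{3} \times 10^{4} \;=\; 1 .
\]

That is the sense in which even a very low “naive realism” prior can be swamped if the likelihood ratios really do keep getting bigger.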
Right, if you start from decision theory then the prior is high and if you start from naive realism then the prior is really low, but I mean, the likelihood ratio started out high the very moment he realized he was abnormally intelligent and he had three last names, and ever since then it just keeps getting bigger, and bigger, and bigger, and bigger, and bigger, and bigger, and...
Having three last names is not that uncommon, and there are a lot of abnormally intelligent people in the world. There have to be around six million people on the planet in the top tenth of a percent in intelligence. So the chance that anything special is happening is still really low at that point. The chance might get higher over time. And it helps that Harry is genre-aware enough to sarcastically ask if there’s a prophecy about him. (That section is still by far one of my favorite parts of the story.) So he’s already located the hypothesis to some extent, although he may have located it more by pattern matching than by actual evidence. Moreover, at the same time, Harry knows from talking to Dumbledore and from reading old books about Gryffindor and others that in their universe there is such a thing as heroes. So Harry doesn’t have a strong reason to think his heroism is any different from Gryffindor’s. He might be the character with well-meaning intentions who goes evil so that someone can arise to stop him in a few years. He’d be the classic MagiTech-using villain, and the story might even have a big anti-transhumanist undercurrent.
It seems that you might be engaging in a weird form of hindsight bias together with possibly the illusion of transparency.
This whole conversation is just so hilarious.
MoRdore approves!
I don’t follow. Or, it seemed like you were listing reasons why he should suspect he’s in a story, but then you seemed to think I was committing hindsight bias for thinking so. Is it because “3 hours” is an exaggeratedly short time to make the inference? ’Cuz “I’m the main character” had secretly been Harry’s hypothesis since forever, as was revealed under the Sorting Hat.
ETA: (Main character status qua main character status is hard to get without teleological optimization, and that’s hard to get without authors behind the scenes. (Evolution counts as an author but that just gets rolled into your baseline… I can’t easily express that. I feel a faint urge to cry.))
Sorry if that wasn’t clear. The first part is a response to your observation about abnormal intelligence. My point was that there are around six million people at Harry’s intelligence level (see the rough check after this comment), so being that intelligent is not a good reason to think one is a protagonist.
I then agreed that Harry’s aware of the genre in question, which might help slightly.
My point about Gryffindor and the like was that even if Harry thinks he’s in a story, he doesn’t have good reason to assume he’s necessarily the protagonist. In fact, look at the confusion from other genre-aware characters about what genre they are in. So Harry could be a well-simulated villain for later use.
Harry under the Sorting Hat thinks he’s important. That’s not the same thing as thinking one is a protagonist in a story. Believing that one is the prophesied hero upon whom everything will fall is something lots of little kids want to believe. A lot of them convince themselves that they are somehow important or unique. (A weird example is Emo and Goth kids who convince themselves that they somehow have terrible suffering which no one else understands. Similarly, this is why X-Men is such a popular comic.)
The hindsight bias is that you know that Harry really is the main character. So in that context it is easy for you to look at all the evidence and say “yeah! See. It is obvious.”
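A rough sanity check of the six-million figure above, assuming a world population of about seven billion at the time (the assumption is mine; the population behind the original estimate isn’t stated):

\[
0.001 \times 7 \times 10^{9} \;=\; 7 \times 10^{6},
\]

i.e. roughly six to seven million people in the top tenth of a percent, so the figure is about right.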
(If I implied that Harry should know he’s the main character then I take that back; I only meant to say he should know he’s a character of note in a story.)
I think that Harry’s supposed to be about one in 2 million intelligence, since Eliezer was at one in a million and Harry’s supposed to be somewhat smarter than Eliezer. Has this been discussed here before? If so, what were the conclusions?
I thought Harry was supposed to be as intelligent as Eliezer, but on the path sooner. Writing characters more intelligent than yourself is generally considered extremely difficult and not often done, though HPMoR breaks enough “rules” already that I wouldn’t be too surprised if you were correct.
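For a sense of scale on “one in a million” versus “one in two million”: a minimal sketch, assuming measured intelligence is normally distributed with mean 100 and SD 15 (the model and the code are my illustration, not anything stated in the thread):

```python
# Convert a "one in N" rarity to a z-score and an IQ-style score,
# assuming scores are normally distributed with mean 100 and SD 15.
from scipy.stats import norm

def rarity_to_iq(one_in_n, mean=100.0, sd=15.0):
    z = norm.isf(1.0 / one_in_n)  # upper-tail inverse CDF
    return z, mean + sd * z

for n in (1_000_000, 2_000_000):
    z, iq = rarity_to_iq(n)
    print(f"1 in {n:,}: z ~ {z:.2f}, IQ-equivalent ~ {iq:.0f}")

# Roughly: 1 in 1,000,000 -> z ~ 4.75, IQ-equivalent ~ 171
#          1 in 2,000,000 -> z ~ 4.89, IQ-equivalent ~ 173
```

The gap between the two rarities is only a couple of points on that scale, and the far tail of a normal distribution is a dubious model of real ability anyway, so treat these as order-of-magnitude illustrations only.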
And Father had finished by saying that plays like this were always unrealistic, because if the playwright had known what someone actually as smart as Light would actually do, the playwright would have tried to take over the world himself instead of just writing plays about it.
(Of course, nothing says he couldn’t write plays as a hobby...)
Interestingly it might be less hindsight bias than typical mind fallacy/heuristic. Harry is like Eliezer, I am like Eliezer, I am like Harry, and I’ve always guiltily thought I was the main character, and I’m pretty sure that Eliezer and Harry have too. (Luckily there are many stories going on.) Combined with all the explicit cues about wanting to become God and thinking he’s incredibly important and then being sorta vindicated by the prophecy...
“Being the Messiah is like being Athlete of the Year.”
if you start from decision theory then the prior is high
I feel obligated to explicitly note that, literally interpreted, this is a straightforward abuse of the words “decision theory” and “prior”, even if the concept I’m getting at isn’t too abusive.
The simulation part makes more sense to me. You mean to say that you think every relevant mind either lives in a simulation or thinks they do?
Please forgive me for asking, but do you want an Argument that Saves everyone who would exist in Tegmark IV? Because I don’t believe this gets you there.