It was abruptly very clear that while Harry was going around trying to live the ideals of the Enlightenment, Dumbledore was the one who’d actually fought in a war. Nonviolent ideals were cheap to hold if you were a scientist, living inside the Protego bubble cast by the police officers and soldiers whose actions you had the luxury to question. Albus Dumbledore seemed to have started out with ideals at least as strong as Harry’s own, if not stronger; and Dumbledore hadn’t gotten through his war without losing friends and killing enemies and sacrificing allies.
Even if Dumbledore was right, and the true enemy was utterly mad and evil… in a hundred million years the organic lifeform known as Lord Voldemort probably wouldn’t seem much different from all the other bewildered children of Ancient Earth. Whatever Lord Voldemort had done to himself, whatever Dark rituals seemed so horribly irrevocable on a merely human scale, it wouldn’t be beyond curing with the technology of a hundred million years. Killing him, if you didn’t have to do it, would be just one more death for future sentient beings to be sad about. How could you look up at the stars, and believe anything else?
Two illustrations:
For commentary, we turn to Bismarck: “A fool learns from his mistakes, but a truly wise man learns from the mistakes of others.”
Do Achilles and Odysseus not seem too different to modern eyes? No: one is pride and folly, and the other prudence and wisdom. But unfortunately one must be Odysseus to know that, and Harry is an Achilles.
History remembers actions; fiction remembers people. And so Harry thinks that the future will remember everyone currently alive as they are in fiction, rather than as their deeds show them to be. Indeed, once you have “cured” Voldemort by scooping out his will and past, what remains? Why does he think the future will hold life to be as precious as the present does, instead of cheap, as it did and will again in Malthusian economies?
Harry does not look at the stars; he looks at himself. He would do better to look at others.
Because he has no intention of letting that happen.
Intentions are insufficient.
Intentions, together with having already outright declared what ‘shall not be’ (see the dementor scene), are sufficient for at least establishing what possible futures Harry cares about and plans on happening. (I personally criticized said scene because it seemed like cheap, overconfident grandstanding of the kind fitting an 11-year-old.)
I’m confused: is this supposed to be a criticism of the writing or of Harry?
Harry, to a certain extent. As well as not the writing per se, but the brand of transhumanist bluster written about. Where some were emotionally roused to cheering, I cringed.
Vaniver was talking in terms of predictions about what the future people would think. You responded in terms of what Harry wanted to happen. Unless you’re making a statement about Harry suffering from some form of bias here (in which case your comments are unclear), Vaniver is right. Intentions have no effect on what the future actually will be. Predicting what will happen in the long term is different from planning to cause something to happen in the short term.
Vaniver was talking about Harry’s evaluation of the future outcomes. Once again, I point out Harry’s forceful and unambiguous declarations to the dementor about what the future ‘shall’ be, and assert the relevance of that kind of thinking to how Harry would evaluate the thoughts of the people he labels as those from the future.
I’ve heard about a particular neurological condition (typically caused by traumatic brain injury) where this is the case. For the rest of us intentions do have an effect. (That’s kind of the main reason we have them in the first place.)
Vaniver wasn’t talking about Harry’s evaluation of future outcomes, he was talking about Harry’s predictions of future thoughts that future people would have. That’s why Vaniver said “why does he think the future will hold life to be precious”, etc. “He think the future will” clearly refers to a prediction made by Harry.
So your response is only relevant if you were trying to say Harry’s predictions were tainted by his value judgements. But I don’t think that’s what you were saying, correct?
Intentions have no impact on the future, only actions do. Unless you want to pretend that the neurons firing around in your brain are causally significant (in terms of effects on the outside world) in any substantive way, which would be dumb. Harry “declaring” that he considers death unacceptable and intends to stop it is “insufficient” to cause immortality. He would need to take actions like making an immortality pill and giving it to everyone, or something.
I believe you are incorrectly modelling the way Harry thinks and misunderstand the implications of the words Harry has uttered. The implicit prediction is conditional on, for example, there being no catastrophic failure and extinction. To illustrate the position: Harry would not change the thinking here, or the degree to which his meaning is valid, if he happened to believe that there was a 95% chance of human extinction, regardless of any possible evaluation of future humans.
That is not my primary point. I would perhaps also say that this is likely. Or at least that he uses overconfident rhetoric when expressing himself, to a degree that my instincts warn me to disaffiliate.
I assert the thing that you say is dumb. My model of causality doesn’t consider atoms inside the computational structure of powerful optimization agents to be qualitatively different in causal significance from atoms outside of such entities. Neurons firing around in powerful brains are among the most causally significant things in existence.
Erm...there isn’t even conservation of energy in that universe. Do you really think Malthusian economics still holds in such a world?
No, they’re both violent primitive barbarians. One preferred a bow, the other a spear, if I remember correctly. And Harry is not trying to look mere thousands of years into the future.
No, I’m pretty sure Harry thinks the future will hold life to be much more precious than the present does.
As for why, probably bad reasoning, but I wouldn’t hold that against him. Moral progress maybe? The optimism of youth? Because if the future doesn’t hold things that are similar to us but better, then it’s a Bad End and probably won’t hold anyone whose opinion we care about?
And yet, we have classics departments.
I suspect Harry will not be disappointed if the future he envisages fails to arrive in a few thousand years.
When it becomes possible to cheaply create life, then I expect Malthusian constraints to quickly become tight. (To be more precise, I mean that, in the long term, the net population growth rate (births minus deaths) times per-capita resource expenditure cannot exceed the resource growth rate.)
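The constraint in that parenthetical can be sketched numerically; here is a minimal toy model (the growth rates and starting quantities are invented for illustration, not taken from the discussion):

```python
# Toy sketch of the Malthusian constraint stated above (all numbers
# invented for illustration): track per-capita resources when the
# population and the resource base compound at different rates.

def per_capita_resources(pop_growth, resource_growth, years,
                         pop0=1.0, res0=100.0):
    """Per-capita resources after `years` of compound growth."""
    pop, res = pop0, res0
    for _ in range(years):
        pop *= 1 + pop_growth        # net growth: births minus deaths
        res *= 1 + resource_growth   # resource base expands
    return res / pop

# Population outgrowing resources (3%/yr vs 1%/yr): per-capita
# resources shrink every year, however large the starting surplus.
squeezed = per_capita_resources(0.03, 0.01, 200)

# Resources outgrowing population (1%/yr vs 3%/yr): per-capita
# resources climb instead.
roomy = per_capita_resources(0.01, 0.03, 200)

assert squeezed < 100.0 < roomy
```

The point of the sketch is just that the constraint is about *ratios of growth rates*: any constant surplus of population growth over resource growth eventually drives per-capita resources toward subsistence, regardless of how rich the starting point is.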
Why? In this world, energy is free. Which means that, with sufficient technology, all resources are free. As long as no one recklessly goes around creating resource-using life forms at an incredible rate, we should be fine...
Is it? There’s a big difference between a constraint you’re not sure about and a constraint that doesn’t exist.
Granted, “Voldemort won’t look so bad from a distance” is absurd. But the ultimate decision he made seemed pretty good, as such things go.
He’s an 11-year-old; he’s allowed to still be stupid in his idealism.
I agree that it’s absurd to find it probable that the future beings would value all lives, even the lives of the incredibly stupid and evil ancients. It’s conceivable, and plausible, but we don’t have any evidence, and we have lots of bias from fiction.
But this comment doesn’t seem relevant to what Vaniver said. Harry being an idealist is in no way related to the probability that future beings will think that Voldemort should or should not have been killed. Your comment makes sense if you’re addressing why Harry came to the conclusions he did, but not if you’re discussing whether or not the sentient creatures would value Voldemort’s life.
My point is that Harry’s beliefs are dumb (getting better, but still dumb), and therefore the conclusions that he draws from those beliefs are silly and poor.
That said, a thought occurs. Does the True Patronus work for someone willing to do murder (even if it’s to save net lives)?
Hmm, if not, then there’s mutual exclusivity between some Dark Wizard and Light Wizard abilities.
You can cast a Horcrux, OR a True Patronus, but not both. Interestingly, both are ways of avoiding encounters with death.
Which is good, because I’m judging him too.