Intentions and having already outright declared what ‘shall not be’ (see dementor scene) are sufficient for at least establishing what possible futures Harry cares about and plans on happening. (I personally criticized said scene because it seemed like cheap overconfident grandstanding of the kind fitting to an 11 year old.)
I’m confused is this supposed to be a criticism of the writing or of Harry?
Harry, to a certain extent. And not the writing per se, but the brand of transhumanist bluster written about. Where some were emotionally roused to cheering, I cringed.
Vaniver was talking in terms of predictions about what the future people would think. You responded in terms of what Harry wanted to happen. Unless you’re making a statement about Harry suffering from some form of bias here (in which case your comments are unclear), then Vaniver is right. Intentions have no effect on what the future actually will be. Predicting what will happen in the long term is different from planning to cause something to happen in the short term.
Vaniver was talking in terms of predictions about what the future people would think. You responded in terms of what Harry wanted to happen.
Vaniver was talking about Harry’s evaluation of the future outcomes. Once again, I point out Harry’s forceful and unambiguous declarations to the dementor about what the future ‘shall’ be, and assert the relevance of that kind of thinking to how Harry would evaluate the thoughts of the people he labels as those from the future.
Intentions have no effect on what the future actually will be.
I’ve heard about a particular neurological condition (typically caused by traumatic brain injury) where this is the case. For the rest of us intentions do have an effect. (That’s kind of the main reason we have them in the first place.)
Vaniver wasn’t talking about Harry’s evaluation of future outcomes; he was talking about Harry’s predictions of the thoughts that future people would have. That’s why Vaniver said “why does he think the future will hold life to be precious”, etc. “He think[s] the future will” clearly refers to a prediction made by Harry.
So your response is only relevant if you were trying to say Harry’s predictions were tainted by his value judgements. But I don’t think that’s what you were saying, correct?
Intentions have no impact on the future, only actions do. Unless you want to pretend that the neurons firing around in your brain are causally significant (in terms of effects to the outside world) in any substantive way, which would be dumb. Harry “declaring” that he considers death unacceptable and intends to stop it is “insufficient” to cause immortality. He would need to take actions like making an immortality pill and giving it to everyone, or something.
Vaniver wasn’t talking about Harry’s evaluation of future outcomes; he was talking about Harry’s predictions of the thoughts that future people would have. That’s why Vaniver said “why does he think the future will hold life to be precious”, etc. “He think[s] the future will” clearly refers to a prediction made by Harry.
I believe you are incorrectly modelling the way Harry thinks and misunderstanding the implications of the words Harry has uttered. The implicit prediction is conditional on, for example, there being no catastrophic failure or extinction. To illustrate the position: Harry would not change his thinking here, or the degree to which his meaning is valid, if he happened to believe that there was a 95% chance of human extinction rather than any particular evaluation by future humans.
So your response is only relevant if you were trying to say Harry’s predictions were tainted by his value judgements. But I don’t think that’s what you were saying, correct?
That is not my primary point, though I would perhaps also say that this is likely. Or at least that he uses overconfident rhetoric when expressing himself, to a degree that my instincts warn me to disaffiliate.
Intentions have no impact on the future, only actions do. Unless you want to pretend that the neurons firing around in your brain are causally significant (in terms of effects to the outside world) in any substantive way, which would be dumb.
I assert the thing that you say is dumb. My model of causality doesn’t consider atoms inside the computational structure of powerful optimization agents to be qualitatively different in causal significance from atoms outside of such entities. Neurons firing around in powerful brains are among the most causally significant things in existence.
Because he has no intention of letting that happen.
Intentions are insufficient.
Erm... there isn’t even conservation of energy in that universe. Do you really think Malthusian economics still holds in such a world?