The Moral of the Story seems to be Harry finding an answer to the weakness, stupidity, and evil of others besides hating them and destroying them.
EY has made it his life goal to create an artificial intelligence that is friendly to humans. A mind that transcends us without hating us. Harry MUST triumph over Quirrell, and he must do so by being more moral, not more intelligent. Because if Harry wins by being smarter, then EY would be conceding that morality is a weakness, or at the very least that strength and strength alone will determine which AI wins. And there would always be that risk that the AI would “grow up”, as Quirrell puts it, and realize that “the reason it is easy for you to forgive such fools and think well of them, Mr. Potter, is that you yourself have not been sorely hurt”. And something tells me that EY’s solution is not to create a being that can’t be hurt.
My guess is that “the power the Dark Lord knows not” is, in some way, a solution to this problem. Harry will triumph for the same reason EY’s friendly AI will (supposedly) triumph. But we will see. I haven’t read enough of EY’s stuff on friendly AI to know for certain what his solution to the AI problem is, only that he thinks he has one.
> Harry MUST triumph over Quirrell, and he must do so by being more moral, not more intelligent.
That doesn’t sound right. If you’re looking for ways Harry could win, why not take Harry’s advice and draw up a list of his relative advantages? He does have them—knowledge of superrationality, knowledge of science, ability to empathize with non-psychopaths, to name three—and they’re likely to be part of the solution.
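As an aside, “superrationality” here refers to Hofstadter’s idea that agents who know they reason identically should treat their choices as correlated. A minimal illustrative sketch (payoff numbers are my own standard Prisoner’s Dilemma values, not anything from the story):

```python
# One-shot Prisoner's Dilemma payoffs (row player, column player).
# Standard illustrative values: temptation 5, reward 3, punishment 1, sucker 0.
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def superrational_choice():
    """A superrational agent facing a copy of its own decision procedure
    knows both players will reach the same answer, so the real choice is
    between the symmetric outcomes (C, C) and (D, D)."""
    if PAYOFF[("C", "C")][0] > PAYOFF[("D", "D")][0]:
        return "C"
    return "D"

a = superrational_choice()
b = superrational_choice()  # the "opponent" runs the identical procedure
print(a, b, PAYOFF[(a, b)])  # both cooperate: C C (3, 3)
```

A classically rational agent defects (defection dominates row by row), so mutual knowledge of the shared decision procedure is what flips the answer to cooperation.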
> then EY would be conceding that morality is a weakness, or at the very least that strength and strength alone will determine which AI will win.
I’m pretty sure that he does believe that if an AI goes FOOM, it’s going to win, period, moral or no. The idea that an AI would not simply be preferable, but would actually win over another AI on account of being more moral, strikes me as, well, rather silly, and not at all in accordance with what I think Eliezer actually believes.
As of last week Eliezer didn’t have any plans to include an allegory to FAI, and expected any such allegory to work very badly in story terms (“suck like a black hole”).
Oh. I feel a little silly now.