I guess this has come up before, but I take it the reason to be Voldemort is that as soon as muggles get wind of magic, they’ll figure out how to become magical, transmute three-stage thermonuclear devices from concrete, apparate them over cities, etc. So magic means the total removal of all technological and economic restrictions on nuclear warfare. And time travel.
So if you figured the muggles would discover the magical world pretty soon, and if you wanted there to be any people at all in the future, you’d have to make the society of magical knowledge completely closed. This means taking over, at least, the magical world, and probably the muggle one too. And to prevent anyone from seeing magic as technology and doing productive research on it, you’d have to make it completely scary, so that fear and moral hatred would override people’s ability to study it proficiently.
If that’s true, then muggle science is similar to a soon-to-be-uncontrollable AI (it is, by many orders of magnitude, a better optimizing system than the magical world’s own research efforts), and Voldemort is a last-ditch effort at reboxing. If that’s right, it seems hard to argue with Voldemort.
I think people in the Less Wrong community are a little too fast to analogize any existential threat to the threat of rogue AI. The threat of people blowing up the world with nuclear weapons seems a lot more analogous to the threat of people blowing up the world with nuclear weapons.