I have to ask you to imagine the IP security issues of AGI and AGI development. Think both internal and external.
Also, I have to add, what the heck makes you think majoritarian/political strategies are somehow more effective than local/in-house approaches?
In the end, building fallout shelters is probably silly, but attempting to reduce the risk of nuclear war sure as hell isn’t. And if you do end up worrying about whether a nuclear war is about to happen, remember that if you can reduce the risk of said war—which might be as easy as making a movie—your actions will have a much, much greater overall impact than building a shelter ever could.
er, I would be happy to trade my money for something that can solve societal risks so effectively as to eliminate any necessity of this scale of personal security, but I would have to really see the details on that movie to see what could possibly justify such a hell of an extraordinary claim as, say, somehow making a significant impact on societal-level risks...
> I have to ask you to imagine the IP security issues of AGI and AGI development. Think both internal and external.
Are there any non-x-risk-related examples you could provide? I think sustained discussion of this specific example may be mind-killing given LW local norms.
> Also, I have to add, what the heck makes you think majoritarian/political strategies are somehow more effective than local/in-house approaches?

> I would have to really see the details on that movie to see what could possibly justify such a hell of an extraordinary claim as, say, somehow making a significant impact on societal-level risks...
I link to the Wikipedia page that discusses this in the article, but basically that film was seen by over 100 million people, including President Reagan. According to Reagan it was “greatly depressing,” changed his mind on how he should view nuclear war, and ultimately led to new nuclear disarmament treaties that resulted in the dismantling of 2,500+ nuclear weapons.
No, I chose my example because it’s exactly relevant.
I’m not willing to discuss that issue here, so unless you have another example I am withdrawing from the discussion of that point.
What makes you think dismantling the United States’ nuclear weapons makes you safer?
Please start reading links; they are there for a reason.
The vast majority of weapons dismantled as a result of the treaty were on the Soviet side. Besides, even if you don’t believe that arms reductions make you safer, the film also produced significant outlook changes on the parts of key decision-makers.
> Please start reading links; they are there for a reason.
>
> The vast majority of weapons dismantled as a result of the treaty were on the Soviet side. Besides, even if you don’t believe that arms reductions make you safer, the film also produced significant outlook changes on the parts of key decision-makers.
Ok, thanks, but even assuming it was a significant positive impact on societal risk, what in the world makes you think you can reproduce that kind of result? It seems like you kind of left the central point of your post rather unsubstantiated/undefended, to say the least.
I want to see your data!
I can’t predict nuclear war, but there are plenty of solid reasons why the risk of some major catastrophe of some sort is increasing (UFAI being one of them).
EDIT: After actually reading your post, I think I get what you are saying now, which is this: focus your resources in an optimal utilitarian fashion, especially on the more likely existential risks (UFAI included).

That completely makes sense to me. I’m just arguing that bomb shelters in particular are not necessarily contrary to those interests, so I don’t really like your article as you’ve written it...
Unfortunately, full-scale nuclear war is very likely to impair medicine and science for quite some time, perhaps permanently.
Meh, I think you’re underestimating how doable it is to rebuild everything from the ground up. The main problems are political, I think. And planet-destroying full-scale nuclear war is pretty unlikely as far as catastrophes go anyway. Remember, most people don’t actually want to destroy the world.
> Also, I have to add, what the heck makes you think majoritarian/political strategies are somehow more effective than local/in-house approaches?
Multiplication.
> No, I chose my example because it’s exactly relevant.
I’m not disagreeing that we need an optimal utilitarian solution. I’m arguing that your thesis here fails toward that end in general.
http://i.imgur.com/MApP3Ff.png