You link this chart:
… but it just shows the percentage of years with wars without taking the severity of the wars into account.
Your link about genocides includes genocides linked to colonialism, but colonialism seems driven by technological progress to me.
This stuff is long-tailed, so past averages are no indicator of future averages: a single event could entirely overwhelm the average (see the simulation sketched below).
See also this classic blog post: https://blog.givewell.org/2015/07/08/has-violence-declined-when-large-scale-atrocities-are-systematically-included/
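As a rough illustration of the long-tail point above (my own sketch, with an assumed Pareto severity distribution and tail index, not data from any of the linked charts): when severities are heavy-tailed, the running average keeps getting yanked around, and a single extreme event can account for a large share of the total.

```python
# Minimal sketch, assuming event "severities" follow a Pareto distribution with
# tail index alpha ~ 1.1 (an illustrative choice, not an empirical estimate).
# It shows that the running average is unstable and that the single largest
# event carries a large share of the total, so a calm historical average
# says little about future averages.
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.1                                   # assumed tail index; close to 1 means a very heavy tail
severities = rng.pareto(alpha, 10_000) + 1.0  # classical Pareto samples with minimum severity 1

running_mean = np.cumsum(severities) / np.arange(1, severities.size + 1)
largest_share = severities.max() / severities.sum()

print(f"average over the first 1,000 events: {running_mean[999]:.2f}")
print(f"average over all 10,000 events:      {running_mean[-1]:.2f}")
print(f"share of total severity from the single largest event: {largest_share:.0%}")
```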
If you look at the probability of dying by violence, it shows a similar trend.
I agree that tail risks are important. What I disagree with is that only tail risks from AGI are important. If you wish to convince me that tail risks from AGI are somehow worse than (nuclear war, killer drone swarms, biological weapons, global warming, etc.), you will need evidence. Otherwise, you have simply recreated the weak argument (which I already agree with): “AGI will be different, therefore it could be bad”.
Probability normalizes by population though.
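One way to make the normalization point concrete (a sketch with made-up numbers, not real statistics): if the absolute number of violent deaths stayed flat while the population grew, the per-person probability of dying by violence would fall even though total violence had not declined.

```python
# Hypothetical numbers for illustration only (not real data): a constant
# absolute death toll looks like a declining trend once divided by a
# growing population.
deaths_per_year = 500_000                      # assumed constant
for population in (1_000_000_000, 4_000_000_000, 8_000_000_000):
    p = deaths_per_year / population
    print(f"population {population:>13,}: P(violent death per year) = {p:.2e}")
```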
My claim is not that the tail risks of AGI are important; my claim is that AGI is a tail risk of technology. The correct way to handle the tail risks of a broad domain like technology is to perform root-cause analysis into narrower factors, like “AGI” and “nuclear weapons” vs. “speed boats”, so you can specifically address the risks of severe things like AGI without getting caught up in basic things like speed boats.
Okay, I’m not really sure why we’re talking about this, then.
Consider this post a call to action of the form “please provide reasons why I should update away from the expert consensus that AGI is probably going to turn out okay”.
I agree that talking about how we could handle technological changes as a broader framework is a meaningful and useful thing to do. I just don’t think it’s related to this post.
My previous comment was in opposition to “handling technological changes as a broader framework”. Like I was saying, you shouldn’t use “technology” broadly as a reference at all; you should consider narrower categories like AGI, which individually have high probabilities of being destructive.
If AGI has a “high probability of being destructive”, show me the evidence. What amazingly compelling argument has led you to have beliefs that are wildly different from the expert consensus?
I’ve already posted my argument here; I don’t know why you have dodged responding to it.
My apologies. That is in a totally different thread, which I will respond to.