If you look at the probability of dying by violence, it shows a similar trend.
Probability normalizes by population though.
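A minimal sketch of what "normalizes" means here (the figures below are made up purely for illustration):

$$
P(\text{dying by violence}) \approx \frac{\text{violent deaths per year}}{\text{population}}
$$

So if violent deaths hold steady at, say, $10^5$ per year while the population grows from $10^8$ to $10^9$, the probability drops from $10^{-3}$ to $10^{-4}$ even though the absolute count never moved.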
I agree that tail risks are important. What I disagree with is that only tail risks from AGI are important.
My claim is not that the tail risks of AGI are important; my claim is that AGI is a tail risk of technology. Like, the correct way to handle tail risks of a broad domain like "technology" is to perform root cause analysis into narrower factors ("AGI" and "nuclear weapons" vs. "speed boats", etc.), so you can specifically address the risks of severe stuff like AGI without getting caught up in basic stuff like speed boats.
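One way to make that decomposition precise (an illustrative sketch in my own notation, not something stated in the thread): write $C_i$ for "a catastrophe caused by factor $i$". By the union bound,

$$
P\left(\bigcup_i C_i\right) \le \sum_i P(C_i)
$$

and the argument is that a few terms such as $P(C_{\text{AGI}})$ and $P(C_{\text{nuclear}})$ dominate the sum while terms like $P(C_{\text{speed boats}})$ are negligible, so mitigation effort should target the dominant factors rather than "technology" as an undifferentiated whole.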
My claim is not that the tail risks of AGI are important; my claim is that AGI is a tail risk of technology.
Okay, I’m not really sure why we’re talking about this, then.
Consider this post a call to action of the form "please provide reasons why I should update away from the expert consensus that AGI is probably going to turn out okay".
I agree talking about how we could handle technological changes as a broader framework is a meaningful and useful thing to do. I just don't think it's related to this post.
My previous comment was in opposition to "handling technological changes as a broader framework". Like I was saying, you shouldn't use "technology" broadly as a reference at all; you should consider narrower categories like AGI which individually have high probabilities of being destructive.
narrower categories like AGI which individually have high probabilities of being destructive.
If AGI has a "high probability of being destructive", show me the evidence. What amazingly compelling argument has led you to have beliefs that are wildly different from the expert consensus?
I’ve already posted my argument here; I don’t know why you have dodged responding to it.
My apologies. That is in a totally different thread, which I will respond to.