You make the mistake of equating something's being generally banned with its not happening. Selling MDMA is generally banned, yet it's still possible to purchase it in many places.
As a stronger argument for your point: in Australia almost no one owns guns; it's very difficult to get one, and I certainly know of no one who has one. Even so, I am completely confident that I could call my shadiest friend, he could call his shadiest friend (and possibly a friend of that friend, to a third degree), and within 7 days I could have a gun for the low, low price of "some monetary compensation".
I’m sure some people in rural areas do. Wiki says:
And that’s only people who legally own guns, of course.
Okay, yes, rural guns exist. That still leaves a population of 20 million+ without access, compared to America, where there are more guns than people...
The rural Australia figure is for the number of people, not the number of guns. But when you compare it to America, you're comparing it to the number of guns. That's comparing apples and oranges.
Certainly; this pointless tangent is becoming more of a statement about gun culture than about banning substances.
The fact that bans have a poor track record in human history does not imply that they are impossible, does it?
My thought is that, just like the FAI problem, this problem requires an invention: a way to engineer the world order so that the ban is effective (for example, by fundamentally altering culture and traditions, by using mass surveillance, by reversing the development and restricting the fabrication of computational resources, or by tightly regulating access to commodities and resources required for computation, such as electricity and silicon).
"I take over the world and create a unified totalitarian state" is a solution that comes with its own existential risks.
Let's steelman his argument into "Which is more likely to succeed: actually stopping all research associated with existential risk, or inventing a Friendly AI?" If you find another reason why the first option wouldn't work, include the desperate effort needed to overcome that problem in the calculation.
I don’t think “existential risk research” and “research associated with existential risks” are the same thing.
Yes, that’s what I meant. Let me edit that.
Me, minutes after writing that: "I precommit to posting this at most a week from now. I predict someone will give a clever answer along the lines of driving humanity extinct in order to stop existential risk research."