This isn’t a problem with the tool, but with the protocols and practices surrounding it.
Joe, I couldn’t discern any verifiable meaning in this sentence. Both the tool and the protocols/practices contribute to the problem.
You can’t possibly think that all of our problems would go away if you gave everybody in the world a lobotomy?
No more nukes. No more surveillance states. No more military/police robots. No more AI threat, nanotech threat, biotech threat. A lobotomy for everyone would increase our chances of surviving the century as a species.
I guess most people would object to an intelligence decrease because they feel it would make them worse off. In a world where the growth of intelligence and knowledge leads to multiple high-probability scenarios of global collapse, this stance looks eerily equivalent to defecting in the Prisoner’s Dilemma. I wonder what Eliezer, an outspoken proponent of cooperation, would say about global lobotomies viewed in this light.
Devil’s advocate mode!