Your writing style feels very disjointed: you keep switching arguments without making it clear when you’re starting in a new direction. With that said, I’ve had thoughts similar to paragraph 2 (the superintelligence section) before. Indeed, slaughtering an animal is such a huge ethical decision that I think it bears on a ton of other major moral questions; how superintelligences will react to us is just one such example.
To reverse that argument: how should we treat a superintelligence? The huge separation most people draw between animals and man implies we should treat superintelligences with something approaching reverence, while the widespread belief in human equality implies almost no distinction between us and them (this is my view). Strangely, I suspect a ton of people believe we should treat them as beneath humans. People seem more convinced by “the AI-box wouldn’t work” arguments than by “the AI-box amounts to slavery” arguments. I’m not sure how to make this fit.
My solution: as intelligence increases, so too does the danger an agent poses, and therefore SIs should have significantly fewer rights. However, the decrease is so gradual that no natural human would ever have their rights curtailed. Later genetically engineered and cybernetically enhanced humans will be a different story. I don’t actually believe this at all, so I’m open to alternatives. Does anybody here believe SIs should have fewer rights than humans?