Must a brain-in-a-vat that controls factory machinery with a simple, predetermined task necessarily be self-aware? Because if it’s not, then it’s no more slavery than the case of a contemporary factory robot, or a horse pulling a cart.
I’d agree that the brains of very primitive animals, or brains that have been heavily stripped down specifically to, say, operate a traffic light, aren’t really worthy of moral consideration. But you’d probably need more intelligent brains for complex tasks like building cars or flying planes, and those probably are worthy of moral consideration—stripping out sapience while leaving sufficient intelligence might be impossible or expensive.