“Safety” means avoiding certain bad outcomes. By using the word “safety”, you’re trying to smuggle in the assumption that “humans remaining the dominant lifeform = good, humans not remaining dominant = bad”.
The argument should be over what humans have that is valuable, and how we can contribute that to the future. Not over how humans can survive.
Agreed. People seem to get hold of the idea that humans are good and machines are bad, and then get into an us-vs-them mindset. Surely all the best possible futures involve an engineered world, where the agony of being a meat-brained human cobbled together by natural selection is mostly a distant memory.
But we have to keep the humans around until humans are capable of engineering that world carefully and without screwing it up. If we don’t engineer it, who will?
Right. There are pretty good instrumental reasons for all the parties concerned to do that. Humans may also be useful for a while for rebooting the system—if there is a major setback. They have successfully booted things up once already. Other backup systems are likely to be less well tested.