https://www.newyorker.com/science/maria-konnikova/hazards-automation I haven’t gone deep into these studies, but I’m aware of claims that mostly-but-not-fully automating a task can have negative consequences: the human operator no longer gets enough practice to be effective during the times she needs to take over, especially in rare, high-leverage situations.
Anecdotally, I work in software development at a company with a lot of services. A service that is not 100% resilient to incomplete or missing data at startup simply gets restarted any time its key data changes, so there is no point in making it only 90% resilient.
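
To make that concrete, here is a minimal sketch of what "100% resilient" means in practice; the file name, polling interval, and class are made up for illustration, not taken from any real service:

    import json
    import os
    import threading
    import time


    def load_key_data(path="key_data.json"):
        """Return the key data, or None if it is missing or incomplete."""
        if not os.path.exists(path):
            return None
        with open(path) as f:
            try:
                return json.load(f)
            except json.JSONDecodeError:
                return None  # a partial/garbled file counts as missing


    class Service:
        """Fully resilient to missing key data at startup: it comes up anyway,
        serves degraded responses, and picks up the data (or changes to it) in
        the background, so it never needs a restart."""

        def __init__(self):
            self.key_data = None

        def start(self):
            self.key_data = load_key_data()
            threading.Thread(target=self._refresh_loop, daemon=True).start()

        def _refresh_loop(self, interval=30):
            while True:
                data = load_key_data()
                if data is not None:
                    self.key_data = data  # absorb changed data without a restart
                time.sleep(interval)

        def handle_request(self):
            if self.key_data is None:
                return "degraded: key data not available yet"
            return "ok"

A service written like this can be started in any order and left alone when its data changes. One that handles only most of the missing-data cases still has to be restarted for the remainder, so operationally you treat it exactly like a service that handles none of them.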