First, laws are needed to determine whether the designer, the programmer, the manufacturer or the operator is at fault if an autonomous drone strike goes wrong or a driverless car has an accident. In order to allocate responsibility, autonomous systems must keep detailed logs so that they can explain the reasoning behind their decisions when necessary. This has implications for system design: it may, for instance, rule out the use of artificial neural networks, decision-making systems that learn from example rather than obeying predefined rules.
I think this is nonsense. Our current legal system manages without "detailed logs": humans attribute blame all the time without them. And a requirement to keep logs would not rule out neural networks anyway, because you can record what a system saw and what it decided without the system itself being rule-based.
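To make that last point concrete, here is a minimal sketch (entirely hypothetical: the toy "network", the field names, and the `predict_and_log` wrapper are all my own invention, not from any real system). It shows an opaque learned model wrapped in a logging layer that records inputs, outputs, and model version, which is the kind of audit trail liability allocation would need, with no requirement that the model's internals be interpretable.

```python
import json

# Toy stand-in for a trained neural network: a single neuron with
# learned weights. Its internals are opaque; we never "explain" it.
WEIGHTS = [0.8, -0.3]
BIAS = 0.1

def predict(features):
    """Opaque learned model: returns a score, no rule-based reasoning."""
    return sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS

decision_log = []  # in practice this would be an append-only audit store

def predict_and_log(features):
    score = predict(features)
    decision = "brake" if score > 0.5 else "continue"
    # Log what matters for attributing blame afterwards: what the
    # system saw, what it decided, and which model version decided it.
    decision_log.append({
        "model_version": "v1.0",  # hypothetical version tag
        "inputs": features,
        "score": score,
        "decision": decision,
    })
    return decision

predict_and_log([1.0, 0.2])
print(json.dumps(decision_log[-1]))
```

The point of the sketch is that the log lives around the model, not inside it: swapping the toy neuron for a deep network with millions of weights changes nothing about what gets recorded.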