I think it is an issue of moral responsibility, the same as with self-driving cars. People don't want a decision that may negatively affect a person to be based on an algorithm, because an algorithm is not a moral agent. They want some expert to stand up and say: "I am staking my professional prestige and accepting the blame by declaring this person prone to violence and thus in need of a restraining order, and if I am wrong, that is on me." As my dad used to say, it all comes down to who is willing to put their dick in the cigar cutter: to accept responsibility, blame, even punishment for a decision that affects others.
Part of it is rational: decision-makers having skin in the game makes decisions better. See Taleb, Antifragile.
Part of it is simply that we are used to thinking, or have evolved to think, that without responsibility there cannot be good decisions, which is true as long as humans make them. We did not evolve to deal with algorithms.