Having the wrong experts on AI risk cited in the article at a critical juncture, when the public is developing its understanding of the issue, can result in people getting killed.
If it shifts the probability of a UFAI disaster by even 0.001%, that equals over a thousand lives saved (the arithmetic is sketched below). That's probably a bigger effect than the five people you save by pushing the fat man.
The moral cost you pay by pushing the fat man is higher than the moral cost of violating Wikipedia norms. The benefit of getting the narrative of the AI risk article right is probably much more valuable than the handful of people you save in the trolley example.
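To make the expected-value arithmetic explicit, here is a minimal sketch; the world-population figure of roughly seven billion is my assumption, not part of the original claim.

$$
\underbrace{10^{-5}}_{\Delta p \,=\, 0.001\%} \times \underbrace{7 \times 10^{9}}_{\text{world population (assumed)}} = 7 \times 10^{4} \;\text{expected lives}
$$

Under that assumption the shift is worth on the order of tens of thousands of expected lives, so "over a thousand" holds with a wide margin, against five for the fat man.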
> If it shifts the probability of a UFAI disaster by even 0.001%, that equals over a thousand lives saved. That's probably a bigger effect than the five people you save by pushing the fat man.
That kind of makes me wonder what you would do in the situation depicted in the movie (and even if you wouldn't, the more radical elements here who no longer discuss their ideas online would).
There's even a chance that the terrorists in the movie are led by an uneducated, fear-mongering crackpot who primes them with invalid expected-utility calculations and trolley problems.
> Having the wrong experts on AI risk cited in the article at a critical juncture, when the public is developing its understanding of the issue, can result in people getting killed.
The world’s better at determining who the right experts are when conflict-of-interest rules are obeyed.