I know this is mostly a philosophical take on Kant, but it's interesting that these kinds of issues will probably come up soon (if not already) in things like GPS apps and self-driving cars. Let's say that some very significant percentage of road users use your GPS app to give them directions while driving. Do you optimise each route individually? Or do you divert one driver, sending them slowly along out-of-the-way single-lane country roads, for the benefit of your other users? If you are a dating app that really, really trusts its matching software, maybe you pair one person with their second-best match, because their first-best match is highly compatible with lots of people while their second-best match is only compatible with them. I suppose both of these examples fall foul of the “don’t ever screw over anyone” principle, although I am suspicious of that principle, because to some extent telling someone to co-operate in a prisoner’s dilemma is screwing them over.
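The routing tension here is a known result in game theory: letting every driver optimise individually can make the average trip worse than a coordinated assignment would. A toy sketch of this, using Pigou's classic two-road example (my illustration, not something the comment specifies):

```python
# Pigou's example: two roads from A to B.
# Road 1 always takes 1 hour regardless of traffic.
# Road 2 takes x hours, where x is the fraction of all drivers on it.

def average_time(x):
    """Average travel time when a fraction x of drivers take road 2."""
    return (1 - x) * 1.0 + x * x  # (1-x) drivers pay 1 hour; x drivers pay x hours

# Individually-optimised ("selfish") routing: road 2 is never slower
# than road 1, so every driver picks it and everyone takes 1 hour.
selfish = average_time(1.0)  # 1.0

# Coordinated routing: search over splits for the lowest average time.
# The optimum sends half the drivers onto each road: 0.75 hours average,
# at the cost of making the road-1 drivers individually worse off.
optimum = min(average_time(i / 1000) for i in range(1001))  # 0.75
```

The gap between the two (the "price of anarchy", 4/3 in this example) is exactly what the app operator would be tempted to close by screwing over the diverted drivers.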