In your first example, it’s clear that expected loss matters as much as intent. It’s not just that you probably don’t have a strong intent to misuse their data; it’s that even if you did, the damage you could do with whatever data is left on your laptop is tiny compared to what you could do with access to a live production database or whatever. In other words, they don’t need some binary model of whether you intend to screw them. Even if they model it as a distribution over the likelihood that you will, now or later, want to misuse the data, it doesn’t matter, because the expected loss comes out small under any of those models.
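The point can be sketched numerically. This is a toy illustration with entirely made-up numbers, not anything from the original discussion: hold the probability of bad intent fixed and vary only the blast radius of the access level, and the expected loss is dominated by the latter.

```python
# Toy sketch (hypothetical numbers): risk is probability-of-malice
# times the damage that access level would allow, not intent alone.
def expected_loss(p_malice: float, damage_if_exploited: float) -> float:
    return p_malice * damage_if_exploited

# Same (made-up) probability of bad intent in both scenarios...
p = 0.01
# ...but wildly different blast radius.
laptop_leftovers = expected_loss(p, 1_000)        # stale data on a laptop
prod_database = expected_loss(p, 10_000_000)      # live production access

print(laptop_leftovers)  # 10.0
print(prod_database)     # 100000.0
```

Note that even a much higher estimate of malicious intent for the laptop case (say `p = 0.5`) still yields a smaller expected loss than a tiny probability paired with production access, which is why the precise shape of their intent model barely matters here.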
To impute a binary model of intent to screw them, they’d have to do something like this: Previously we had lots of people who just had root access to our production environment. We now want to tighten up and give everyone permission sets tailored to their actual job requirements. However, silentbob has been with us for a while and would have screwed us by now if that was their intent, so let’s just let them keep their old root account since that’s slightly less work and might slightly improve productivity.
In your second example, I’m involuntarily screaming NO NO NO DO NOT MAKE THAT CHANGE WHAT THE FUCK IS WRONG WITH YOU before I even get to their reasoning. By the time I’ve read their reasoning I already want them fired. When you report the results of the A/B test, I’m thinking: well, of course the data showed that. How could you possibly have expected anything else to happen?
I’m starting to think I’ve been programming for too long. Like, I didn’t even have to think to know what would actually happen. I just felt it immediately.
In your third example, I don’t think that’s how gun enthusiasts usually reason about this point, but I respect that this isn’t really what you’re getting at.