I have thoughts, but they are contingent on better understanding what you mean by “types” of hidden information. For example, you used “operating a cocaine dealership” as a “type” of information hidden by the question about health information. Operating a cocaine dealership is not a health matter, except perhaps indirectly if you get high on your own supply.
To further illustrate this ambiguity, a person might be having gay sex in a country where gay sex is criminalized, morally condemned, and viewed as a shocking aberration from normal human behavior. It does not seem to me to be out of integrity for a gay person to refrain from telling other people that they have gay sex in this (or any other) context.
Where this becomes problematic is when the two people have different views about what constitutes reasonable expectations and moral behavior. If we give free moral license to choose what to keep private, then it seems to me there is little difference between this onion model and an algorithm for “defining away dishonesty.” One can always justify oneself by saying “other people were foolish for having had such unreasonable expectations as to have been misled or upset by my nondisclosure.” If “outsiders” are expected to be sufficiently cynical, then the onion model would even justify outright lying and withholding outrageous misbehaviors as within the bounds of “integrity,” as long as other people expected such infractions to be occurring.
In short, it seems that this standard of integrity reduces to “they should have known what they were getting into.”
As such, the onion model seems to rely on a pre-existing consensus on both what is epistemically normal and what is morally right. It is useful as a recipe for producing a summary in this context, but not for dealing with disagreement over what is behaviorally normal or right.
I agree. For me it’s more of a characterization of honesty, not integrity (even though I consider honesty an aspect of integrity). Perhaps we should change the name.
I think part of my answer here is “The more a person is a long-term trade partner, the more I invest in their knowing about my inner layers. If it seems like they have different expectations than I do, I’m proactive in sharing information about that.”
Yes, I agree the onion model can be a way to think about summarization and how to prioritize information sharing. I just don’t see it as particularly helpful for integrity.