I like to think I’m trustworthy, and people (in person, at least) seem willing to confide in me and trust me to keep secrets. However, I’m not a contractualist (or other form of deontologist) at heart, and there probably exist circumstances where I deem it better for the world to break a promise (or oath, or other very serious contract) than to keep it.
No contract or promise can possibly be as complex as the universe. I just don’t know all the circumstances that will require me to test or use my knowledge. My promise of confidentiality always has exceptions, even though I very rarely state them. So do yours—unless you’re specifically trained, you won’t resist torture, for a boring example.
I don’t think explicit manipulation or deception is my primary concern—it happens, but I usually think it hurts the manipulator more than it hurts me. I worry a lot more about non-adversarial incorrect beliefs or models—secrets shared without any boundaries and without specifics about what the consequences of disclosure would be (taking the infohazard elements seriously, in order to make good risk decisions) tend to be difficult to approach rationally.