Can I double click on: “I don’t think it’s game-theoretically sound”?
And I don’t know enough about the Islamic merchants to answer that question. I guess the notes were countersigned along the way and so on. But true, it does sound a bit outlandish to modern ears (which is why I thought it’d make a cute opening).
Attempted expansion:
If the person you are dealing with is willing to do the right thing even if it costs them something, a lot of different systems will work.
If you do not trust the person you are dealing with to do the right thing in the absence of incentives, you want a system that will impose incentives that ensure it is in their interest to do the right thing. I’m calling this ‘game-theoretically sound’ to mean ‘the system accomplishes the intended goal even when one or both parties are the kinds of sociopath that prevail in game theory problems’.
This could be formal law (do your end of the contract or the government will punish you).
It could be informal threats (uphold the bargain or me and my mates will clobber you).
In some circumstances it could be reputation (honor your note or no one will deal with you again; refund my purchase or I will leave you a bad review online).
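All three mechanisms fit the same shape: they attach a penalty to defecting, and the system is 'game-theoretically sound' when the penalty is large enough to flip a purely self-interested player's best response. A toy sketch (my own framing, with made-up illustrative payoffs, not anything from the thread):

```python
# Toy model: a one-shot exchange against a cooperating partner.
# Without an external penalty, the self-interested best response is to
# defect; a sufficiently large penalty (law, clobbering, reputation)
# flips the best response to cooperation. Payoff numbers are arbitrary.

def best_response(penalty):
    """Return the action a purely self-interested player prefers."""
    payoff = {
        "cooperate": 3,           # honest trade
        "defect": 5 - penalty,    # cheating gain, minus imposed penalty
    }
    return max(payoff, key=payoff.get)

print(best_response(penalty=0))  # 'defect': no incentive system in place
print(best_response(penalty=4))  # 'cooperate': the penalty makes honesty pay
```

The exact numbers don't matter; the point is that any of the three systems works iff it can credibly make the penalty exceed the gain from cheating.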
However, for reputation to actually work as a system that is protected against adversaries/sociopaths (as distinct from working as a system for nice people who are already pretty much trustworthy), you need the damages caused by reputation to exceed the benefits gained by defecting. This is plausibly true for ancient merchants living in small societies, or for companies that have large numbers of customers. I don’t think it’s true for your networks of trust.
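That 'damages exceed benefits' condition can be made concrete with a repeated-interaction sketch (again my own toy model with illustrative numbers): a rational defector cheats when the one-shot gain beats the value of all the future trades a ruined reputation would cost them.

```python
# Toy model: does cheating once pay, given the future trade it forfeits?
# In a small society (or for a company with many customers), the stream of
# future trades at stake is large, so reputation deters defection; in a
# loose network with few repeat interactions, it doesn't. All numbers
# below are illustrative assumptions.

def defection_pays(cheat_gain, per_trade_profit, expected_future_trades):
    """True if the one-shot gain from cheating beats the lost future trade."""
    future_value = per_trade_profit * expected_future_trades
    return cheat_gain > future_value

# Merchant in a small, tight-knit society: many future trades at stake.
print(defection_pays(cheat_gain=100, per_trade_profit=10,
                     expected_future_trades=50))   # False: reputation binds

# Stranger in a sparse network: few repeat interactions.
print(defection_pays(cheat_gain=100, per_trade_profit=10,
                     expected_future_trades=5))    # True: cheating pays
```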
That is not necessarily a problem for you! If you’re just dealing with your neighbor, you don’t need your system to be defensible against sociopaths. But if you want to scale up your system, it will become more and more relevant.
Hope this is clearer, apologies for length.
I appreciated the length!
It’s true, building a network with high barriers to entry is hard to scale, and it will never compare to the scalability of a functioning justice system or something like a blockchain. It also relies a lot on weeding out sociopaths to function, though the value of belonging to a high-trust network can itself work as an incentive to play fair and avoid exclusion.
Like, if that note you signed gets cashed in after your death, who pays it?
Also, what if someone forges your signature and makes fake notes?
Those are the questions I have about the written promises.