At first glance, if we’re talking about a thing that requires cooperative effort from many people across time, this seems like a heck of a principal-agent problem.
It turns out a whole lot of cooperation is achievable without explicit control. It’s not so much principal-agent, where a principal knows what they want and the agent has different goals. It’s more like agent-agent (or really, principal-principal), where all participants want compatible things, and small individual trades (I’ll let you keep 10% if you mill my grain) add up over time to fairly long chains of behaviors that build bridges and convenience stores and websites where we can discuss the puzzle of cooperation-without-coordination.
I’d argue that this is what “institution” means: a common understanding of what kinds of exchanges and behaviors will be rewarded. Institutions are mutual expectations that evolve bottom-up, not structures designed top-down. Though, of course, human intent can influence what kinds of culture are prevalent in any given subgroup.