I like this a lot. I had come to similar conclusions, but from the opposite direction and for only partially related reasons.
My intuition was that nothing is perfectly fungible, save perhaps currency, and that is the very reason it was invented. For example, you can buy food with dollars and water with dollars, but if you have less than a certain threshold of either you will die. You mention non-fungible resources, but invoke sacredness in a broader context.
I was motivated by the problem of effectively preserving intuition when thinking about practical problems. I’ll point to the example of the AI Impacts entry on the discontinuity of nuclear weapons, because it uses cost-effectiveness as the method for investigating discontinuity effects. The motivation is clear: we want to compare like with like as much as possible, so converting conventional explosives and nuclear explosives into the same unit makes sense. Yet this leaves us vulnerable to a different kind of category error, because we tend to carry the assumptions of the unit over into our reasoning about the object being measured. In this case people can assume (and arguably the US government did assume) that nuclear weapons are fungible and linear. Yet two bombs are clearly not equivalent to one bomb of twice the yield, nor is it reasonable to conclude that the more powerful bomb is a better deal merely because it costs less than twice as much. The important intuitions about the object are thus obscured by the metric.
In service of making this distinction, I like the option of using sacredness as a positive attribute rather than having to describe at length why another assumption is inappropriate. It’s a small and specialized use-case, but one that I have seen a need for lately.