I don’t have an actual theorem; I’m reasoning by analogy. I was thinking of something analogous to No Free Lunch in the compression field. No compression algorithm is ideal for all data, but you can still say “this one is better than that one” in a given domain. E.g., bzip2 typically compresses smaller than gzip on real-world input (though it takes more work to apply). Analogously, you shouldn’t expect any decision theory to be “perfect” on all inputs, just better across a sensible range of them.
Of course, I’ve just realised the analogy is flawed: the compression NFL says no algorithm can make all possible inputs smaller, whereas a decision theory can be arbitrarily inefficient as long as it’s correct, and correctness is the question being asked. But I’m still wondering whether there’s something NFL-like here regardless.
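Both halves of the compression claim are easy to check empirically. A quick sketch using Python’s standard-library `gzip` and `bz2` modules (the specific sample inputs are my own illustration, not anything from the discussion above): repetitive text shrinks under both codecs, while random bytes, which stand in for the “all possible inputs” side of NFL, come out slightly *larger* under both, since no lossless codec can shrink every input.

```python
import bz2
import gzip
import os

# Compressible input: highly repetitive text.
compressible = b"the quick brown fox jumps over the lazy dog\n" * 500
# Incompressible input: random bytes, near-maximal entropy.
incompressible = os.urandom(20_000)

for label, data in [("text", compressible), ("random", incompressible)]:
    g = len(gzip.compress(data))
    b = len(bz2.compress(data))
    print(f"{label}: original={len(data)} gzip={g} bzip2={b}")
```

On the text input both compressed sizes are far below the original; on the random input both exceed it by a few bytes of header/framing overhead. Which codec “wins” on the text depends on the input’s structure, which is exactly the domain-dependence point.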