It seems you want to define the complexity of QM by summing over all algorithms that can generate the predictions of QM, rather than just taking the shortest one.
Yes, though to be clear, what I would sum is the prior probability associated with each individual algorithm’s complexity, to get the prior probability that their common set of predictions is correct. I don’t consider the common set of predictions to have a conceptually useful complexity in the same sense that the algorithms do.
In that case you should probably take the same approach to defining K-complexity of bit strings: sum over all algorithms that print the string, not take the shortest one. Do you subscribe to that point of view?
I would apply the same approach to making predictions about bit strings.
Why? Both are bit strings, no?
My computer represents numbers and letters as bit strings. This doesn’t mean it makes sense to multiply letters together.
This is related to a point I attempted to make previously: you can measure complexity, but you must pick the context appropriately.
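For concreteness, the textbook formalization of the two options contrasted above, assuming a prefix-free universal machine $U$ (this is the standard Kolmogorov/Solomonoff setup, which may not be exactly what either side has in mind):

$$K(x) = \min\{\,|p| : U(p) = x\,\}, \qquad m(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}$$

By the coding theorem, $-\log_2 m(x) = K(x) + O(1)$, so a prior built by summing over all programs and one built from the shortest program alone agree to within a multiplicative constant.

A minimal sketch of the difference on a toy machine (the program-to-output table below is hypothetical, standing in for a real universal machine; programs are bitstrings and the table is prefix-free, so the weights sum to 1):

```python
from collections import defaultdict

# Hypothetical toy machine: each "program" (a bitstring) is mapped
# to the output it would print. Not a real universal machine.
toy_machine = {
    "0":   "0000",
    "10":  "0000",   # a second, longer program printing the same string
    "110": "0101",
    "111": "0000",   # a third program printing "0000"
}

def shortest_program_complexity(machine):
    """K-style complexity: length of the shortest program printing x."""
    best = {}
    for program, output in machine.items():
        if output not in best or len(program) < best[output]:
            best[output] = len(program)
    return best

def summed_prior(machine):
    """Solomonoff-style prior: sum of 2**-len(p) over all programs printing x."""
    prior = defaultdict(float)
    for program, output in machine.items():
        prior[output] += 2.0 ** -len(program)
    return dict(prior)

print(shortest_program_complexity(toy_machine))  # {'0000': 1, '0101': 3}
print(summed_prior(toy_machine))                 # {'0000': 0.875, '0101': 0.125}
```

On this toy machine the string with three programs printing it absorbs most of the prior mass: the summed prior favors it 7:1, while the shortest-program weights alone would favor it only 4:1.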