Basically, K-complexity treats the very laws of physics as the random thing.
Assuming a lawful universe, assigning the distribution per algorithm is better than per universe. But there are many ways to assign a distribution per algorithm; even selecting a different (Turing-complete) programming language yields a different K-complexity.
As I understood the question in the article, once we assign the distribution per algorithm, the only necessary thing is to give a nonzero starting probability to every algorithm. This is enough; Bayesian updates will then make subsequent predictions more reliable.
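A toy sketch of that claim (the tiny hypothesis set and all names here are made up for illustration; real Solomonoff induction ranges over all programs and is uncomputable): as long as the true predictor starts with nonzero weight, Bayesian updating concentrates the posterior on it, whether we start from a length-based (K-complexity-style) prior or a flat one.

```python
# Toy illustration: a handful of "algorithms" predicting the next bit, compared under
# two priors -- a length-based (K-complexity-like) prior and a flat one. With enough
# data, Bayesian updating concentrates on the correct predictor under either prior,
# as long as it started with nonzero probability.

# Each hypothesis is (name, description_length, predict_fn); predict_fn gives P(next bit = 1).
hypotheses = [
    ("always_one",  3, lambda history: 1.0),
    ("always_zero", 3, lambda history: 0.0),
    ("alternating", 5, lambda history: 0.0 if (history and history[-1] == 1) else 1.0),
    ("fair_coin",   8, lambda history: 0.5),
]

def normalize(weights):
    total = sum(weights)
    return [w / total for w in weights]

# Two choices of prior over the same hypothesis set.
length_prior = normalize([2.0 ** -length for _, length, _ in hypotheses])
flat_prior   = normalize([1.0 for _ in hypotheses])

def posterior(prior, data):
    """Bayesian update: weight each hypothesis by the probability it assigns to the data."""
    weights = list(prior)
    history = []
    for bit in data:
        for i, (_, _, predict) in enumerate(hypotheses):
            p_one = predict(history)
            weights[i] *= p_one if bit == 1 else (1.0 - p_one)
        history.append(bit)
    return normalize(weights)

# True source: alternating bits 1,0,1,0,...
data = [i % 2 for i in range(1, 21)]
for label, prior in [("length-based prior", length_prior), ("flat prior", flat_prior)]:
    post = posterior(prior, data)
    print(label, {h[0]: round(p, 4) for h, p in zip(hypotheses, post)})
```

Under both priors the posterior ends up concentrated on the alternating predictor; they differ only in how the hypotheses are weighted before the data decides.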
So the question is: assuming that we give nonzero prior probability to every algorithm:
a) is there any advantage in using Kolmogorov complexity? or
b) are all other prior distributions, in some sense, just variants of K-complexity? or
c) something else...?