Yeah, exactly—the problem is that there are some small-volume functions which are actually simple. The argument "small volume → complex" doesn't go through, since the function could be specifiable in some other, shorter way.
Other senses of simplicity include various circuit complexities and Levin complexity. There's no argument that parameter-space volume corresponds to either of them, AFAIK. (You might think parameter-space volume would correspond to "neural net complexity", i.e. the number of neurons in a minimal-size neural net needed to compute the function, but I don't think this is true either—each parameter is drawn from a Gaussian, so it's very unlikely that most of them are zero.)
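A quick numerical sketch of that last point (illustrative only—the Gaussian prior is from the comment above, but the "near zero" threshold is my arbitrary choice): under a standard Gaussian prior, only a tiny fraction of parameters land near zero, so a region where "most parameters are (near) zero"—which is roughly what matching a smaller minimal network would require—has vanishingly small measure.

```python
import numpy as np

# Sample parameters from a standard Gaussian prior (assumed here;
# the comment only says "Gaussian", not which variance).
rng = np.random.default_rng(0)
n_params = 10_000
theta = rng.normal(0.0, 1.0, size=n_params)

# Call a parameter "near zero" if |theta_i| < eps (eps is my choice).
eps = 0.01
frac_near_zero = np.mean(np.abs(theta) < eps)

# Analytically, P(|theta_i| < eps) ≈ 2 * eps / sqrt(2*pi) ≈ 0.008
# for eps = 0.01, so the probability that, say, half the parameters
# are near zero is astronomically small.
print(frac_near_zero)
```

So under this prior, the typical draw has essentially all parameters bounded away from zero, which is why volume in parameter space seems unlikely to track minimal-network size.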