The easiest test would be to zip some trained net params, and also zip some randomly initialized standard normals of the same shapes as the net params (including e.g. the parameter names, if those are stored in the params file), and see whether the two compress to about the same size.
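A minimal sketch of that harness, in Python with NumPy. The shapes and parameter names here are placeholders: in the real test you would load an actual checkpoint and reuse its names and shapes for the random baseline. `zipped_ratio` serializes each array as a `.npy` member of a deflate-compressed zip and reports compressed size over raw size.

```python
import io
import zipfile

import numpy as np


def zipped_ratio(arrays):
    """Zip a dict of name -> array and return compressed_size / raw_size."""
    buf = io.BytesIO()
    raw_bytes = 0
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED, compresslevel=9) as zf:
        for name, arr in arrays.items():
            member = io.BytesIO()
            np.save(member, arr)          # .npy header includes the name/shape metadata
            data = member.getvalue()
            raw_bytes += len(data)
            zf.writestr(name + ".npy", data)
    return len(buf.getvalue()) / raw_bytes


rng = np.random.default_rng(0)

# Placeholder shapes; in the real test, take these from the trained checkpoint.
shapes = {"layer0.weight": (256, 256), "layer0.bias": (256,)}
random_params = {
    name: rng.standard_normal(shape).astype(np.float32)
    for name, shape in shapes.items()
}

# trained_params = {...}  # load the real checkpoint here and compare:
# print("trained:", zipped_ratio(trained_params))
print("random normals:", zipped_ratio(random_params))
```

As a sanity check that the harness can detect compressible structure at all, an all-zeros tensor of the same shape should compress to a small fraction of its raw size, while random float32 normals should stay close to ratio 1.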