I value empiricism highly, i.e. putting ideas into action so they can be tested against the universe; but I believe I’ve read EY state somewhere that a superintelligence would need to perform very few, or even zero, experiments to discover many (most? all?) of the true things about our universe that we humans can only reach through painstaking effort and experimentation.
Most notably, this.
Also this, as well as this comment on it.