That sounds wise, but I’m having trouble understanding what it is you are actually saying. How exactly are you defining intelligence and rationality here? Wei Dai gave definitions with demonstrable overlap; you claim they are different. How?
Intelligence/muscles = a natural faculty every human is born with, to a greater or lesser degree.
Rationality/gymnastics = a teachable set of techniques that help you refine what you can do with said natural faculty.
This probably explains why the distinction between intelligence and rationality makes sense for humans (where some abilities are innate and some are learned), but doesn’t necessarily make sense for self-improving AIs.
Intelligence is about our biological limits, which determine how much optimizing power we can produce in the short term (on the scale of seconds); this is more or less fixed. Rationality is about how we use that optimizing power over the long term. Intelligence is how much “mental energy” you can generate per second; rationality is how you use that energy, whether you are able to accumulate it, and so on.
In humans, it seems that most of this generated energy is wasted, so there can be a large gap between how much “mental energy” you can generate per second and whether you can accumulate enough of it to reach your life goals. (Hence the question: “if you are so smart, why aren’t you rich?”) A hypothetical perfect Bayesian machine would use all of its “mental energy” efficiently, so there would be some equation connecting its intelligence and rationality.
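To make the analogy concrete, here is a minimal toy model of my own (not anything proposed in the comments above): treat intelligence as a fixed generation rate and rationality as the fraction of generated “mental energy” that actually gets accumulated toward a goal. The names `generation_rate` and `accumulation_efficiency` are hypothetical labels for those two quantities, chosen only for illustration.

```python
# Toy model of the "mental energy" analogy above (illustrative sketch only).
# intelligence ~ fixed generation rate (optimizing power per second)
# rationality  ~ fraction of generated power that is accumulated toward a
#                long-term goal instead of being wasted

def accumulated_power(generation_rate: float,
                      accumulation_efficiency: float,
                      seconds: float) -> float:
    """Total optimizing power banked toward a goal over a time span."""
    return generation_rate * accumulation_efficiency * seconds

# Very "intelligent" but poorly rational: most generated power is wasted.
smart_but_scattered = accumulated_power(generation_rate=10.0,
                                        accumulation_efficiency=0.05,
                                        seconds=1_000_000)

# Less "intelligent" but highly rational: ends up ahead over the long term.
modest_but_focused = accumulated_power(generation_rate=4.0,
                                       accumulation_efficiency=0.5,
                                       seconds=1_000_000)

print(smart_but_scattered)  # 500000.0
print(modest_but_focused)   # 2000000.0

# A hypothetical perfect Bayesian machine would have efficiency == 1.0, so its
# accumulated power would be determined entirely by its generation rate:
# accumulated_power(rate, 1.0, t) == rate * t
```

In this sketch the “if you are so smart, why aren’t you rich?” gap is just the difference between a high `generation_rate` and a low `accumulation_efficiency`; for the perfect Bayesian the two notions collapse into one equation, which is the sense in which the distinction stops being useful for self-improving AIs.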