A Digression: This is interesting to me in a way worth spelling out.
Often, I find that thinkers, even seemingly broad thinkers, have one idea, one simple concept that connects to everything they do.
A typical example of this is Nassim Taleb, who says himself that he is concerned with one topic: Randomness.
Eliezer is the other primary person to whom I attribute one key idea (I’d be interested if he, or anyone else, disagrees). It seems to me that the one idea encompassing everything Eliezer does is Intelligence.
Reading this post, I realized: randomness is the other side of the coin, the inverse of intelligence! The absence of intelligence, seen “from the inside”, is randomness. Randomness is just phenomena responding to an ordering higher than one’s mind can grasp.
Randomness is a measure of intelligence. The greater one’s intelligence, the less randomness there is (to it).
My world just grew a little more connected.
While I agree with the gist of what you’re saying, you may want to rephrase the sentence above (it sounds too general, and makes for a terrible statement when taken out of context).
The point I’m trying to make is that a low-intelligence person/animal/machine/system can also perceive very little randomness, so the relation doesn’t hold at either end of the spectrum. There is very little randomness as perceived by an entity in blissful ignorance, and there is very little randomness to a “sufficiently advanced intelligence”.
Furthermore, this isn’t something that happens only in extreme cases; it’s a pattern that appears at many levels and in many forms.
That rather renders the whole point moot. However, I do concede that, for a given set of data within a fixed paradigm, the rule does have some applicability.
You may be interested in another word, then: entropy.
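To make the entropy connection concrete, here is a small sketch of my own (not from the thread): the same bit stream carries maximal Shannon entropy for an observer with no model, and zero entropy for an observer who knows the generating process, here Python's stdlib pseudorandom generator standing in for the "higher ordering".

```python
import math
import random

rng = random.Random(2024)          # the hidden generator (the "higher ordering")
bits = [rng.randint(0, 1) for _ in range(10_000)]

def entropy_bits(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Observer A has no model: it only sees symbol frequencies,
# so the stream looks like fair coin flips (~1 bit/symbol).
p1 = sum(bits) / len(bits)
h_naive = entropy_bits([p1, 1 - p1])

# Observer B knows the generator and its seed, so it replays the
# stream and predicts every bit perfectly: 0 bits of surprise.
replay = random.Random(2024)
correct = sum(b == replay.randint(0, 1) for b in bits)
h_informed = 0.0 if correct == len(bits) else entropy_bits([0.5, 0.5])

print(round(h_naive, 2))   # close to 1.0
print(h_informed)          # 0.0
```

The point of the sketch: nothing about the bits themselves changed between the two observers; only the model did, which is exactly the sense in which "randomness" is observer-relative.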