Where does the value of knowledge come from? Why does compressing that knowledge add to that value? Are you referring to knowledge in general, or thinking about knowledge within a specific domain?
In my personal experience, the value of finding an application for existing knowledge always outstrips the value of acquiring new knowledge. For example, I may learn the name of every single skipper of an America’s Cup yacht over the entire history of the event, but that would not be very valuable to me, as there is no opportunity to exploit it. I may even ‘compress’ it for easy recall by means of a humorous mnemonic, like Bart Simpson’s mnemonic for Canada’s Governors General[1], or Robert Downey Jr.’s technique of turning the first letter of each of his lines in a scene into an acrostic. However, unless called upon to recite a list of America’s Cup skippers, Canada’s first Governors General, or the dialogue in a Robert Downey Jr. film, when does this compression add any value?
Indeed, finding new applications for knowledge we already have always holds an opportunity-cost advantage over acquiring new knowledge. For example, every time an app or a website changes its UI, there is a lag in accomplishing the same task, as I now need to reorient or even learn a whole new procedure.
“Clowns Love Hair-Cuts, so Should Lee Marvin’s Valet”: Charles, Lisgar, Hamilton, Campbell, Lansdowne, Stanley (Should-ley), Murray-Kynynmound, and ‘valet’ rhymes with “Earl Grey” is my best guess.
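The acrostic trick described above can be sketched in a few lines of Python (a toy illustration; the “scene” is made up, not actual dialogue):

```python
def acrostic(lines):
    """Compress a list of lines down to the first letter of each,
    the way the anecdote describes memorizing dialogue in a scene."""
    return "".join(line[0].upper() for line in lines if line)

# Made-up example lines:
scene = ["roses are red", "every dog barks", "dawn comes early"]
acrostic(scene)  # -> "RED"
```

The point of the thread stands either way: the compressed form (“RED”) is only useful if you are ever asked to reproduce the scene.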
Compression is what happens when you notice that two things share the same structure, and your brain just kinda fuses the shared aspects of the mental objects together into a single thing. Compression = Abstraction = Analogy = Metaphor. Compression = Eureka moments. And the amazing thing is that the brain performs cognition on compressed data just as fast as on the original data, effectively increasing your cognitive speed.
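For what it’s worth, the “fuse shared structure into a single thing” idea has a direct software analogue in hash-consing (a toy sketch of that technique, not a claim about how brains actually work):

```python
# Toy hash-consing: structurally identical objects are fused into one
# shared canonical instance, so structural equality collapses into
# a single identity check.
_pool = {}

def make(label, *children):
    """Return the canonical instance for this (label, children) shape."""
    key = (label, children)
    if key not in _pool:
        _pool[key] = key  # first occurrence becomes the canonical copy
    return _pool[key]

a = make("pair", make("leaf"), make("leaf"))
b = make("pair", make("leaf"), make("leaf"))
a is b  # True: the shared structure was only ever built once
```

Once two structures are fused this way, comparing them is a single pointer check rather than a full traversal, which loosely parallels the claim that cognition on compressed data is no slower than on the original.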
For example, I think there’s a lot of value in merging as much of your everyday observational data about humans as feasible into abstracted psychology concepts, and I wanna understand models like the Big Five (insofar as they’re correct) much better on an intuitive level.
Compressing existing knowledge >> Acquiring new knowledge
Don’t spend all your time compressing knowledge that’s not that useful to begin with, if there are higher-value things to be learned.
True!
Useless knowledge should neither be learned nor compressed, as both take cognitive effort.
The way I put that may have been overly obscure. But I’ve come to refer, in my mind, to the way the brain chunks information and notices patterns and parallels in it for easier recall and use as just “compression.”