I see your point, but I don’t think either of those is (or should be) embarrassing. Higher-level aspects of intelligence, such as capacity for abstraction and analogy, creativity, etc., are far more important, and we have no known peers with respect to those capacities.
The truly embarrassing things to me are things like paying almost no attention to global existential risks, having billions of our fellow human beings live in poverty and die early from preventable causes, and our profound irrationality as shown in the heuristics and biases literature. Those are (i.e., should be) more embarrassing limitations, not only because they are more consequential but because we accept and sustain those things in a way that we don’t with respect to WM size and limitations of that sort.
What do you think of the suggestion that you feel they are more important in part because humans have no peers there?
That’s an astute question. I think I almost certainly do value those things more than I otherwise would if we did have peers. Having said that, I believe that even if we did have peers with respect to those abilities, I would still think that, for example, abstraction is more important, because I think it is a central aspect of the only general intelligence we know in a way that WM is not. There may be other types of thought that are more important, and more central, to a type of general intelligence that is beyond ours, but I don’t know what they are, so I consider the central aspects of the most general intelligence I know of to be the most important for now.
In what way is that? I don’t see why abstraction should be considered more important to our intelligence than WM. Our intelligence can’t go on working without WM, can it?
I can imagine life evolving and general intelligence emerging without anything much like our WM, but I can’t imagine general intelligence arising without something a lot like (at least) our capacity for abstraction. This may be a failure of imagination on my part, but WM seems like a very powerful and useful way of designing an intelligence, while abstraction seems much closer to a precondition for intelligence.
Can you conceive of a general intelligence that has no capacity for abstraction? And do you not find it possible (even if difficult) to think of general intelligence that doesn’t use a WM?
Particularly since our most advanced thinking relies far less on working memory. Advanced expertise brings with it the ability to manipulate highly specialised memories in what would normally be considered long-term memory. It doesn’t replace WM, but it comes close enough for our imaginative purposes!
I agree with you about intelligences in general. I was asking about your statement that abstraction is central to the only general intelligence we know in a way that WM is not, i.e. that WM is less important than abstraction, in some sense, in the particular case of humans, if that’s what you meant.
I mean just that abstraction is central to human intelligence and general intelligence in a way that seems necessary (integral and inseparable) and part of the very definition of general intelligence, whereas WM is not. I can imagine something a lot like me that wouldn’t use WM, but I can’t imagine anything remotely like me or any other kind of general intelligence that doesn’t have something very much like our ability to abstract. But I think that’s pretty much what I’ve said already, so I’m probably not helping and should give up.
They may be far more important because we have no peers. That’s what makes it a competitive advantage.
That makes them important in our lives, yes, but anonym’s comment compares us against the set of all possible intelligences (or at least all intelligences that might one day trace their descent from us humans). If so, there should be an argument for their objective or absolute importance.
I don’t think they are objectively or absolutely the most important with respect to all intelligences, only to the most powerful intelligence we know of to this point. If we encountered a greater intelligence that used other principles that seemed more central to it, I’d revise my belief, as I would if somebody outlined on paper a convincing theory for a more powerful kind of intelligence that used other principles.
Yeah, those are rather worse! I guess it depends just how tragic and horrific something can be and still be embarrassing!