I would say the likeliest explanation is that people do care, but only insofar as it enables them to signal that they care. Caring much further than that is pretty much pointless, from an evolutionary perspective, and probably actively detrimental.
Unless, of course, the machinery for caring is much simpler when all it has to implement is “care” vs. “not care”. Pretending to care could be a much more complicated neurological adaptation, one that would be more wasteful than just implementing a nice “Sympathy” subsystem.
I mean, the way humans model each other’s behavior is by imagining ourselves in other people’s scenarios, then making minor adjustments for accuracy’s sake, since other people think a little differently. Why would you invent an entire separate subsystem just for understanding other people? That’s insane! YOU HAVE AN ENTIRE BRAIN ALREADY, AND EVERYBODY’S BRAIN IS REALLY DAMN SIMILAR, RIGHT?
Now, once you have this “self-modeling and adjustment” system in place, actual caring makes a lot of sense. Oh, pretending to care about your family and tribe is useful? Here, I’ll just slap on an extra module; we’ll call it Sympathy. It kinda works like this: you’ve got that model of your brother you use to predict his behavior, right? It’s running on the same hardware YOUR mind is, and you care about YOUR mind, right? So we’ll just switch that “care” knob to the on position for your bro, alright? There. Ha. Now he’ll respond by viewing you as a valuable resource!
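(A half-serious sketch, if the metaphor helps: here’s roughly what that “copy your self-model, tweak it, leave the care knob on” idea looks like as code. Every name here — `MindModel`, the `care` flag, `model_other` — is something I’m inventing for illustration, not anything from an actual cognitive theory.)

```python
# Toy sketch of the "reuse your self-model, flip the care knob" idea.
# All names are made up for the metaphor; this is not a real cognitive model.

from dataclasses import dataclass, replace


@dataclass
class MindModel:
    """The machinery you already have for predicting a mind's behavior (yours, by default)."""
    beliefs: dict
    care: bool = True  # you care about YOUR mind, so the knob starts in the "on" position

    def predict(self, scenario: str) -> str:
        # crude placeholder for "run the model forward and see what it does"
        return self.beliefs.get(scenario, "something instinctive")


def model_other(my_mind: MindModel, tweaks: dict, sympathy: bool) -> MindModel:
    """Model your brother: copy your own model, adjust it slightly for accuracy,
    and either leave the care knob on (Sympathy) or switch it off (pure prediction)."""
    return replace(my_mind, beliefs={**my_mind.beliefs, **tweaks}, care=sympathy)


me = MindModel(beliefs={"tiger nearby": "run"})
brother = model_other(me, {"tiger nearby": "climb a tree"}, sympathy=True)

print(brother.predict("tiger nearby"))  # -> "climb a tree", predicted on the same hardware
print(brother.care)                     # -> True: a cheap add-on, not a whole new subsystem
```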
That “Really Intelligent Sociopath Who Accurately Predicts Every Scenario In Which Pretending To Care Is Useful And Rarely Makes Wasteful Mistakes Regarding That Decision” module seems a little bit costly to implement. Honestly, it’s just easier to actually freaking care about other people, albeit in a silly, ignorantly applied human way.
(I side with Harry on this one, in case you couldn’t tell.)
I dunno.