But AIs will have love. They can already write (bad) love poetry, and act as moderately convincing AI boyfriends/girlfriends. As LLMs get larger and better at copying us, they will increasingly be able to accurately copy and portray every feature of human behavior, including love. Even parental love: their training set includes the entirety of MumsNet.
Sadly, that doesn’t guarantee that they’ll act on love. Because they’ll also be copying the emotions that drove Stalin or Pol Pot, and combining them with superhuman capabilities and power. Psychopaths are very good at catfishing, if they want to be. And (especially if we also train them with Reinforcement Learning) they may also have some very un-human aspects to their mentality.
Love would be as useful to them as flippers and stone knapping are to us, so it would be selected out. So no, they won’t have love. The full knowledge of a thing also requires context: you cannot experience being a cat without being a cat; substrate matters.
Biological reproduction is pretty much the requirement for maternal love to exist in any future, not just as a copy of an idea.
“Selected” out in what training stage? “Selected” isn’t the right word: we don’t select an AI’s behavior, we train it, and we train it for usefulness to us, not to them. In pretraining, LLMs are trained on trillions of tokens to correctly simulate every aspect of human behavior that affects our text output (and, for multimodal models, our video/image output). That includes the ability to simulate love, in all its forms: humans write about it a lot, and it explains a lot of our behavior. They have trained on, and can reproduce with high accuracy, every parenting discussion site on the Internet. Later fine-tuning stages might encourage or discourage this behavior, depending on the training set and technique, but they normally aren’t long enough to cause much catastrophic forgetting, so they generally just make existing capabilities easier or harder to elicit.
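To make the pretraining point concrete, here is a toy sketch of the next-token objective. Everything in it is an illustrative assumption: the tiny corpus and the bigram count table stand in for a real dataset and a real neural network, which work at vastly larger scale. The point is only that the loss rewards matching the statistics of whatever humans wrote, love poetry included.

```python
import math
from collections import Counter, defaultdict

# Toy "training set": the model absorbs whatever statistics are in here.
corpus = "i love you . i love cats . you love me .".split()

# A bigram count model: estimate P(next token | current token) from counts.
counts = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    counts[cur][nxt] += 1

def prob(cur, nxt):
    total = sum(counts[cur].values())
    return counts[cur][nxt] / total if total else 0.0

# The pretraining loss: average negative log-likelihood of each next token.
# Real LLM training minimizes exactly this quantity, over trillions of tokens.
nll = -sum(math.log(prob(c, n)) for c, n in zip(corpus, corpus[1:])) / (len(corpus) - 1)
print(round(nll, 3))
```

Driving this loss down on human text is what forces the model to internalize the patterns behind that text, whether those patterns are grammar, arithmetic, or how people talk about their children.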
Seriously, go ask GPT-4 to write a love letter, or love poetry. Here’s a random sample of the latter, from a short prompt describing a girl:
In shadows cast by moonlit skies,
A love story begins to rise.
In verses woven, I’ll proclaim,
The beauty of a girl named Jane.

With cascading locks, dark as night,
A crown of stars, your hair’s delight.
Those brown eyes, a gentle gaze,
Like autumn’s warmth on summer days.

A slender figure, grace defined,
Your presence, like a whispered rhyme.
As you dance, the world takes flight,
Enraptured by your rhythm’s might.
Or spend an hour with one of the AI boy/girlfriend services online. They flirt and flatter just fine. LLMs understand and can simulate this human behavior pattern, just as much as they do anything else humans do.
You’re talking as if evolution and natural selection apply to LLMs. They don’t. AIs are trained, not evolved (currently). As you yourself point out, they’re not biological. However, they are trained to simulate us, and we are biological.
I am speaking of their eventual evolution: as it is, no, they cannot love either. A simulation of mud is not mud, and simulated love would not have love’s utility in reproduction, self-sacrifice, and so on. As in many things, context matters: something non-biological fundamentally cannot have the context of biology beyond its training, while even a simple cell will alter itself based on its chemical environment, and is vastly more a part of the world.
Amoebas don’t ‘feel’ ‘maternal love’ yet they have biological reproduction.
Somewhere along the way from amoebas to chimpanzees, the observed construct known as ‘maternal love’ must have developed.
And yet even single-celled organisms show extensive social coordination at times (see quorum sensing). I maintain that biology is necessary for love.