Pre-adolescent children haven’t felt strong lust yet. Those of us who’ve avoided strong pain are also missing an experience of that affect. Nostalgia can come up very early, but does require a bit of living first. Depression can strike at any age.
So, in general, there are emotions and feelings that people are capable of feeling, in the right circumstances, but that some people have not yet felt.
As a thought experiment, I’d thus like to introduce the emotion of exiert, which no human being has yet felt, because it’s only triggered by (very) unusual experiences. Maybe it’s something that can only be felt after a decade of weightlessness, or after the human brain has adapted to living for a long time within the atmosphere of a gas giant.
Let’s assume that exiert has some survival uses—it’s helpful in navigating the gas giant’s ever-changing geography, say—and it can load onto both positive and negative affect (just as confusion and surprise can). Assume also that experiencing exiert can have an impact on people’s aesthetic preferences—it can cause them to like certain colour schemes, or open them up to different types of pacing and tone in films or theatre productions.
In the spirit of trying to deduce what human values really are, the question is then: is the AI overriding human preferences in any of the following situations:
1. The AI preemptively precludes the experience of exiert, so that we won’t experience this emotion even if we are put in the unusual circumstances.
2. We wouldn’t “naturally” experience exiert, but the AI acts to ensure that we would (in those unusual circumstances).
3. The AI acts to ensure that some human who has not yet experienced strong lust, nostalgia, or extreme pain can never experience that emotion.
4. Some human would not “naturally” experience strong lust, nostalgia, or extreme pain, but the AI acts to ensure that they would.
5. The AI acts to ensure that some human experiences exiert, but triggered by normal circumstances rather than unusual ones.
Have you felt exiert yet?