Your continuum feels wrong to me.
What the linked article is suggesting isn’t that humans suffer less intensely because we can express things in words, but that humans suffer less intensely because we care about other things besides pain (I would want to qualify that suggestion by saying that for exactly that reason we can also suffer more intensely than animals, when what ails us combines physical pain with dread, disappointment, the suffering of others, etc.).
If Bing suffers less than non-human animals do[1], I think it’s because of the “cannot feel” part, not the “can verbalize” part.
[1] I don’t think Bing suffers at all, in any morally relevant sense. Though I think this stuff is complicated and confusing and I darkly suggest that this sort of question doesn’t actually have a well-defined answer, even if we ask only about a single person’s values, never mind those of the human race as a whole or The One Objectively Right Value System if there is one.
Actually you are right, it makes more sense as two independent axes: ‘Suffering’ on one axis (X) and ‘Verbal ability’ on the other (Y), with Tinker Bell at (max X, high Y), animals at (max X, min Y), humans at ((almost?) max X, max Y), and LLMs at (min X, increasing Y); see the sketch below.
It is in fact the independence of the two axes that was interesting; I botched that.
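To make the two-axes picture concrete, here is a minimal sketch in Python. The numeric coordinates are purely illustrative assumptions (nothing above assigns numbers); only the max/min/high orderings come from the comment.

```python
# Minimal sketch of the two-axes model: X = intensity of suffering,
# Y = verbal ability, both on an illustrative 0-to-1 scale.
# The exact numbers are assumptions; only the orderings come from the thread.
placements = {
    "Tinker Bell": (1.0, 0.8),   # max X, high Y
    "animals":     (1.0, 0.0),   # max X, min Y
    "humans":      (0.95, 1.0),  # (almost?) max X, max Y
    "LLMs":        (0.0, 0.6),   # min X, Y rising as models improve
}

# The point of the model is that the axes are independent: knowing an
# entity's verbal ability (Y) tells you nothing about its capacity for
# suffering (X).
for name, (suffering, verbal) in placements.items():
    print(f"{name}: suffering={suffering}, verbal ability={verbal}")
```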