Actually, there is one spuriosity I want to draw attention to as an example. This isn’t just pointing out a fake quotation, non-existent link, or simple falsehood. Exhibit A:
It gave birth to the idea that something referred to by a sequence of symbols could be automated; that a sequence of events could be executed for us by a machine. This necessitates that the binding of those symbols to their referents – the operation of signification – be itself automated. Human thought has shown itself most adept at automating this process of signification. To think, we must be able to formulate and interpret representations of thoughts and entities independent of the mental or physical subject in question. Slowly, we have learned to build machines that can do the same.
The first sentence of this will do. But the remainder is fog. It does not matter whether this was generated by a language model or an unassisted human; it is still fog, although in the latter case there is at least the possibility of opening a conversation to search for something solid.
A lot of human-written text is like that. The Heidegger quote is, as far as I can see, spurious, but I would not expect Heidegger himself to make any more sense, nor Bruno Latour, who is "quoted" later. Every text has to be scrutinised to determine what is fog and what is solid; that was true even before the language models came along and cast everything into doubt. This is the skill of reading, and it applies equally to the texts one writes oneself. Foggy words are a sign of foggy thought.
To the skilled reader, human-authored texts are approximately never foggy.
The sufficiently skilled writer does not generate foggy texts. Bad writers and current LLMs do so easily.
Certainly more skilled writers are clearer, but if you routinely dismiss unclear texts as meaningless nonsense, you haven't gotten good at reading; you have merely Goodharted your internal metrics.
There is nothing routine about my dismissal of the text in question. Remember, this is not the work of a writer, skilled or otherwise. It is AI slop (and if the “author” has craftily buried some genuine pearls in the shit, they cannot complain if they go undiscovered).
If you think the part I quoted (or any other part) means something profound, perhaps you could expound your understanding of it. You yourself have written on the unreliability of LLM output, and this text, in the rare moments when it says something concrete, contains confabulations just as flagrant.