That is, in fact, a helpful elaboration! When you said
Most people who “work on AI alignment” don’t even think that thinking is a thing.
my leading hypotheses for what you could mean were:
Using thought, as a tool, has not occurred to most such people
Most such people have no concept whatsoever of cognition as being a thing, the way people in the year 1000 had no concept whatsoever of javascript being a thing.
Now, instead, my leading hypothesis is that you mean:
Most such people are failing to notice that there’s an important process, called “thinking”, which humans do but LLMs “basically” don’t do.
This is a bunch more precise! For one, it mentions AIs at all.