To me, this distinction is what makes consciousness special. I think it is a fascinating consequence of a certain pattern of interacting systems, which would imply that conscious feelings occur all over the place; perhaps every feedback system is feeling something.
This sounds like the point Pinker makes in How the Mind Works—that apart from the problem of consciousness, concepts like “thinking” and “knowing” and “talking” are actually very simple:
(...) Ryle and other philosophers argued that mentalistic terms such as “beliefs,” “desires,” and “images” are meaningless and come from sloppy misunderstandings of language, as if someone heard the expression “for Pete’s sake” and went around looking for Pete. Simpatico behaviorist psychologists claimed that these invisible entities were as unscientific as the Tooth Fairy and tried to ban them from psychology.

And then along came computers: fairy-free, fully exorcised hunks of metal that could not be explained without the full lexicon of mentalistic taboo words. “Why isn’t my computer printing?” “Because the program doesn’t know you replaced your dot-matrix printer with a laser printer. It still thinks it is talking to the dot-matrix and is trying to print the document by asking the printer to acknowledge its message. But the printer doesn’t understand the message; it’s ignoring it because it expects its input to begin with ‘%!’ The program refuses to give up control while it polls the printer, so you have to get the attention of the monitor so that it can wrest control back from the program. Once the program learns what printer is connected to it, they can communicate.” The more complex the system and the more expert the users, the more their technical conversation sounds like the plot of a soap opera.

Behaviorist philosophers would insist that this is all just loose talk. The machines aren’t really understanding or trying anything, they would say; the observers are just being careless in their choice of words and are in danger of being seduced into grave conceptual errors. Now, what is wrong with this picture? The philosophers are accusing the computer scientists of fuzzy thinking? A computer is the most legalistic, persnickety, hard-nosed, unforgiving demander of precision and explicitness in the universe. From the accusation you’d think it was the befuddled computer scientists who call a philosopher when their computer stops working rather than the other way around.

A better explanation is that computation has finally demystified mentalistic terms. Beliefs are inscriptions in memory, desires are goal inscriptions, thinking is computation, perceptions are inscriptions triggered by sensors, trying is executing operations triggered by a goal.
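To make that closing mapping concrete, here is a minimal toy sketch (my own illustration, not anything from the book): beliefs as inscriptions in memory, desires as goal inscriptions, perception as inscriptions triggered by sensors, thinking as computation over those inscriptions, and trying as operations executed in pursuit of a goal. The printer scenario is borrowed from Pinker's example above; the class and field names are mine.

```python
class Agent:
    """A toy agent illustrating Pinker's mapping of mentalistic terms."""

    def __init__(self):
        self.beliefs = {}   # beliefs: inscriptions in memory
        self.goals = []     # desires: goal inscriptions

    def perceive(self, sensor, reading):
        # perception: an inscription triggered by a sensor
        self.beliefs[sensor] = reading

    def think(self):
        # thinking: computation over existing inscriptions,
        # producing a new inscription (a derived belief)
        if self.beliefs.get("printer_ack") is False:
            self.beliefs["printer_reachable"] = False

    def try_to(self, goal):
        # trying: executing operations triggered by a goal
        self.goals.append(goal)
        if goal == "print" and not self.beliefs.get("printer_reachable", True):
            return "polling printer..."  # keeps trying, as in Pinker's example
        return "sent job"


agent = Agent()
agent.perceive("printer_ack", False)  # the printer is ignoring the message
agent.think()                         # the program now "thinks" it can't reach it
print(agent.try_to("print"))          # -> polling printer...
```

Nothing here is mysterious: "knowing" is a dictionary entry, "thinking" is a conditional over that entry, and "trying" is a loop that won't give up. Which is exactly the demystification Pinker is pointing at, while the hard problem of what it feels like remains untouched.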