Do LLMs sometimes simulate something akin to a dream?
When dreaming, we sometimes simulate a very different person from our waking self: we make decisions uncharacteristic of our own, experience a world very different from waking reality, and are even sometimes implanted with memories of events we never lived.
And still, I think most people would consider that simulated 'dream person' a sentient being. We experience that person as ourselves; it is imbued with our consciousness for the duration of the dream. We live the dream as a living person, as ourselves, yet not always as our waking selves.
Keeping this thought in mind, let's ask:
"What is happening inside an LLM when we ask it to continue a short story from the point of view of some imaginary character?"
"What is happening inside an LLM when we ask it to think 'step by step' about a problem?"
The short, easy, and correct answer is: "We don't know."
We can, in principle, trace the transformer's activations as they cascade through the network, but just as with tracing interactions between human neurons, this exercise does not tell us enough to pinpoint where and how sentience might reside.
Given the similarities in architecture between an LLM and a brain, combined with training guided by human feedback, I wonder whether we accidentally trigger something similar to a human dream in these systems: something like our temporary dream person, who exists only for a short task or thought to be simulated and is then terminated.
I propose that both nature and gradient descent have converged on some common abstractions and logical structures for certain tasks, and that a short simulation of a thought or an action, whether in a dream or in an LLM, might sometimes be very similar.
And if the answer is 'yes, LLMs sometimes replicate a human dream', and we consider ourselves sentient while dreaming, then the ramification is that these LLMs do sometimes give rise, for a short time, to something we would consider consciousness.
tl;dr: I propose that it is possible for some queries to activate in an LLM something similar to human dreaming in its level of sentience.