what’s wrong with stopping at red lights? I do it, and it keeps me alive.
(It literally just occurred to me that you might have been using ‘red lights’ as a metaphor for reactions from people you don’t like. So that you mean that you’ve learned to stop or change direction if you get these signals. Was this what you meant?)
There is nothing wrong with stopping at red lights. There is something wrong with believing you should stop at red lights simply because you think you should. The belief should be anchored somewhere.
A major clarification that may help: the matrix does not provide a reason to act in any given scenario; it just remembers how to act in that scenario. There is no “updating” in the sense of the belief becoming more or less accurate. The belief can change or grow, but it isn’t correct or incorrect. Even so, the system still treats these as beliefs.
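If it helps to make the distinction concrete, the “matrix” as described could be sketched as a plain scenario-to-action store. This is only an illustrative model, not anything the author specifies; all names here are made up:

```python
# Hypothetical sketch: the "matrix" stores how to act in a scenario.
# It attaches no reasons and no truth-values to what it stores.

class Matrix:
    def __init__(self):
        self._habits = {}  # scenario -> action; nothing is "accurate" or "inaccurate"

    def act(self, scenario):
        # Just recall how to act; no justification is attached.
        return self._habits.get(scenario)

    def update(self, scenario, action):
        # The stored belief can change or grow, but it is never
        # marked right or wrong; it is simply overwritten.
        self._habits[scenario] = action

m = Matrix()
m.update("red light", "stop")
print(m.act("red light"))  # prints "stop"
```

The point of the sketch is that `update` never compares the new action to reality; the store holds behavior, not claims that could be true or false.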
So do you think there’s a human system which includes a closer approximation of reality? (whatever that means)
I think my question is related to the exchange above.
No. I meant literal traffic signals, not a metaphor.
What do you mean by “human system”? I think The Simple Truth provides a much better system for beliefs.