A scientific mindset has a lower probability of being positive expected value, because it involves more than one value when making decisions, and those values are sometimes in conflict with each other. This can lead to cognitive dissonance in daily life. That is because science is a tool, the best one we have. Aligning with reality has a higher probability, because it is an emotional heuristic that requires only one value.
Aligning with reality means submitting yourself emotionally, much as a religious person submits to God, but in this case to our true creator: logic, defined here as “the consistent patterns which bring about reality”. Then you accept facts fully. You understand that everything is probabilities, as per one interpretation of quantum mechanics, and that experience is a tool rather than a goal. Using inductive reasoning and choosing actions by positive expected value allows you to accept facts and stay aligned with reality.
It’s hard if you keep thinking in binary terms, in absolutes, 1s and 0s, because to be able to accept facts is to be able to accept that you might be wrong: everything is probabilities, infinite possibilities. Practically, if you know that exercising every day is positive expected value, for example, then as you align yourself with reality in every moment, you realize that even if you accidentally injure yourself today, you won’t give up on reality, because you took the most efficient action given your knowledge and you had already accounted for the probability of accidentally injuring yourself.
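To make this concrete, here is a rough expected-value sketch in Python; all of the numbers are invented purely for illustration.

```python
# Toy expected-value calculation for "exercise today", with invented numbers.
p_injury = 0.01        # assumed probability of an accidental injury on a given day
value_workout = 1.0    # assumed benefit of a normal, injury-free workout (arbitrary units)
cost_injury = -20.0    # assumed cost of an injury (arbitrary units)

ev = (1 - p_injury) * value_workout + p_injury * cost_injury
print(f"Expected value of exercising today: {ev:+.2f}")
# Positive EV (+0.79 here) means the decision was right before the outcome was known,
# even on the unlucky day when the injury actually happens.
```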
So as you keep feeling, you also update those feelings with the probabilities, which keeps your emotions aligned with reality and makes situations like the one above easier to handle, or something more specific, say, someone breaking your trust. You had already taken that possibility into consideration, so you won’t completely lose trust in, or feeling for, reality.
When you accept and align yourself with reality, and then with the facts which underlie it, given our current understanding and as long as the likelihood is high, you keep aligning yourself. Experience truly is a feedback loop which returns whatever you feed it.
Regarding what aligning with reality entails:
When you’re constantly aligning yourself with reality, then as long as you deem the probability high, you’ll be able to emotionally resonate with the insights you gain. For example, neuroscience will tell you that you and your environment are not separate from each other; it’s all a part of your neural activity. So helping another is helping you. If that doesn’t resonate enough, then, for example, evolutionary biology’s idea that we’re all descendants of stardust might. Or that there is a probability, although very small, that you don’t exist (as per QM). So what happens? Your identity and self vanish, as they are no longer aligned with reality; you accept facts, emotionally.
Then you keep the momentum by taking logical actions according to positive expected value, after you learn what truly is you, and so on.
It’s about what Einstein and Carl Sagan believed in: Spinoza’s God. However, Einstein couldn’t accept QM because he was already thinking in absolutes, and was unaware of how the brain works. We do know now; for example, we know that we’re all inherently in denial, how memory storage works, etc. If he had known that, he might have had a different view.
I can’t really fix up this text right now, but I hope it can somehow help you understand what it means to align with reality. It’s really important to accept that experience is a tool, not a goal, drawing on insights from evolutionary biology, for example. Then there is reality. Who is aligning, if there is only reality?
I think there is an irreconcilable tension between your statement that one should completely emotionally submit to and align with facts, and that one should use a Bayesian epistemology to manage beliefs.
There are many things in life and in science that I’m very certain about, but by the laws of probability I can never be 100% certain. There are many more things that I am less than certain about, and hold a cloud of possible explanations, the most likely of which may only be 20% probable in my estimation. I should only “submit” to any particular belief in accordance with my assessment of its likelihood, and can never justify submitting to some belief 100%. Indeed, doing so would be a form of irrational fundamentalism.
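To illustrate what I mean by submitting only in proportion to likelihood, here is a toy sketch; the hypothesis names and credences are entirely made up.

```python
# Toy credence table: commit to each explanation only in proportion to its probability.
# The mass reserved for "other" reflects explanations I haven't thought of yet.
credences = {
    "explanation_A": 0.20,   # most likely single explanation, still only 20%
    "explanation_B": 0.18,
    "explanation_C": 0.15,
    "explanation_D": 0.12,
    "other_unconsidered_hypotheses": 0.35,
}

assert abs(sum(credences.values()) - 1.0) < 1e-9  # credences must sum to 1
for hypothesis, p in credences.items():
    print(f"{hypothesis}: weight {p:.2f}, never 1.00")
```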
For example, neuroscience will tell you that you and your environment are not separate from each other; it’s all a part of your neural activity. So helping another is helping you. If that doesn’t resonate enough, then, for example, evolutionary biology’s idea that we’re all descendants of stardust might. Or that there is a probability, although very small, that you don’t exist (as per QM). So what happens? Your identity and self vanish, as they are no longer aligned with reality; you accept facts, emotionally.
I feel it might help you to know that none of this is actually factual. These are your interpretations of really vague and difficult-to-pin-down philosophical ideas, ideas about which very smart and well-read people can and do disagree.
For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses. The same could be said for the idea that helping another is helping yourself. That’s not true if the other I’m helping is trying to murder me—and if I can refute the generality with one example that I came up with in half a second of thought, it’s not a very useful generality.
I suspect that you haven’t read through all of Eliezer’s blog posts. His writings cover all the things you’re talking about, but do it in a way that is grounded in much sturdier foundations than you appear to be using. It also seems that you are very much in love with this idea of Logic as being the One Final Solution to Everything, and that is always a huge danger sign in human thinking. Just thinking probabilistically, the odds that the true Final Solution to Everything has been discovered and that you are in possession of it are very low. Hence the need to keep a distribution of likelihoods over beliefs rather than putting all your weight down 100% on some perspective that appeals to you aesthetically.
I should only “submit” to any particular belief in accordance with my assessment of its likelihood, and can never justify submitting to some belief 100%. Indeed, doing so would be a form of irrational fundamentalism.
Not necessarily, because the submitting is a means rather than the goal, and you will never be fully certain anyway. It’s important to recognize empirically how your emotions work contrary to a Bayesian epistemology, and how using their mechanisms paradoxically leads to something which is more aligned with reality. It’s not done with Bayesian epistemology; it is done with emotions, which do not speak in our language and are possibly hard-wired to be that way. So we become aware of this and mix in the inductive reasoning.
For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses.
Holding something as “true in some narrow technical sense” yet “false in probably more relevant senses” is called cognitive dissonance. Empirically it can even be seen to be this way by some basic reasoning, both emotionally and factually, which is what I am talking about and which needs to be investigated. You’re proving my point :)
That’s not true if the other I’m helping is trying to murder me—and if I can refute the generality with one example that I came up with in half a second of thought, it’s not a very useful generality.
That’s simply semantics. The problem is attaching emotionally to a sense of “I”, which is not aligned with reality, independent of action. You may speak of this practical body, hands, “I”, for communication, but it all arises in your neural activity without a center, and it’s ever changing. Empirically, it arises in the subjective reference frame, which is taken as a premise for this conversation.
I suspect that you haven’t read through all of Eliezer’s blog posts. His writings cover all the things you’re talking about, but do it in a way that is grounded in much sturdier foundations than you appear to be using.
Yes. I’m unsure whether his writings cover what I am talking about, as is evident from what you’ve said so far. Not that I blame you; I just want us to meta-observe ourselves so we can be more aligned.
It also seems that you are very much in love with this idea of Logic as being the One Final Solution to Everything, and that is always a huge danger sign in human thinking. Just thinking probabilistically, the odds that the true Final Solution to Everything has been discovered and that you are in possession of it are very low. Hence the need to keep a distribution of likelihoods over beliefs rather than putting all your weight down 100% on some perspective that appeals to you aesthetically.
I’m unsure what counts as a danger sign in human thinking if you change perspective; the likelihood that the change is worse than what we have is low. You only need a limited emotional connection to science and rationality to realize this, and to see how bad thinking spreads epidemically now, but coming from someone like us, isn’t it more likely to be good thinking? Investigating this is very likely to be positive expected value, because you, I, and others inherently possess qualities which are not aligned with reality. I want to reassure you of something, however.
Alignment with reality is the most probable to give equilibrium as it’s aligned with the utility function. When you are in a death spiral and not aligned (yet think you are aligned), aligning with reality might seem like not aligning (“very much false in probably more relevant senses”), or even the opposite: that it would go against the utility function and lead to an experience opposite to what came before. That is how it looks, but if you are honest with your emotions, the baseline experience has a hard time seeing beyond itself. That is why it matters to understand that experience is a tool, not a goal: although it yields what would be considered “satisfaction of that goal”, that only happens by accepting facts, and it can’t happen inside the death spiral.
I’m unsure if this is possible to communicate with words; that is quite a limitation of language, and it seems that regardless of what I say to you, you cannot see beyond it. That’s why I want to start a discussion about how we can become more aligned with reality and where to start from, whether it be neuroscience studies or whatever.
It’s important to recognize empirically how your emotions work contrary to a Bayesian epistemology, and how using their mechanisms paradoxically leads to something which is more aligned with reality. It’s not done with Bayesian epistemology; it is done with emotions, which do not speak in our language and are possibly hard-wired to be that way. So we become aware of this and mix in the inductive reasoning.
Science does not actually know how emotions work to the degree of accuracy you are implying. Your statement that using emotional commitment rather than Bayesian epistemology leads to better alignment with reality is a hypothesis that you believe, not a fact that has been proven. If you become a very successful person by following the prescription you advocate, that would be evidence in favor of your hypothesis, but even that would not be very strong evidence by itself.
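To put rough numbers on why a single success story would be weak evidence, here is a toy Bayesian update; every number in it is an assumption.

```python
# Toy Bayesian update: how much should one personal success story move my credence
# that the prescription works? All numbers are assumptions.
prior = 0.10               # prior credence that the prescription really works
p_success_if_works = 0.30  # chance of becoming very successful if it works
p_success_if_not = 0.15    # chance of the same success without it (base rate)

posterior = (p_success_if_works * prior) / (
    p_success_if_works * prior + p_success_if_not * (1 - prior)
)
print(f"Prior {prior:.2f} -> posterior {posterior:.2f} after one observed success")
# Roughly 0.10 -> 0.18: a real update, but nowhere near proof.
```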
Holding something as “true in some narrow technical sense” yet “false in probably more relevant senses” is called cognitive dissonance. Empirically it can even be seen to be this way by some basic reasoning, both emotionally and factually, which is what I am talking about and which needs to be investigated. You’re proving my point :)
I am not sure what you’re saying here. “Cognitive dissonance” is not the same thing as observing that a phenomenon can be framed in two different mutually contradictory ways. I do not have an experience of dissonance when I say, “From one point of view we’re inseparable from the universe, from a different point of view we can be considered independent agents.” These are merely different interpretative paradigms and neither are right or wrong.
Yes. I’m unsure whether his writings cover what I am talking about, as is evident from what you’ve said so far. Not that I blame you; I just want us to meta-observe ourselves so we can be more aligned.
I am trying to say nicely that Eliezer’s writings comprehensively invalidate what you’re saying. The reason you’re getting pushback from Less Wrong is that we collectively see the mistakes that you’re making because we have a shared bag of epistemic tools that are superior to yours, not because you have access to powerful knowledge and insights that we don’t have. You would really benefit in a lot of ways from reading the essays I linked before you continue proselytizing on Less Wrong. We would love to have you as a member of the community, but in order to really join the community you will need to be willing to criticize yourself and your own ideas with detachment and rigor.
I’m unsure what counts as a danger sign in human thinking if you change perspective; the likelihood that the change is worse than what we have is low. You only need a limited emotional connection to science and rationality to realize this, and to see how bad thinking spreads epidemically now, but coming from someone like us, isn’t it more likely to be good thinking? Investigating this is very likely to be positive expected value, because you, I, and others inherently possess qualities which are not aligned with reality. I want to reassure you of something, however.
I’m not arguing that changing perspective from default modes of human cognition is bad. I’m arguing that your particular brand of improved thinking is not particularly compelling, and is very far from being proven superior to what I’m already doing as a committed rationalist.
Alignment with reality is the most probable to give equilibrium as it’s aligned with the utility function. When you are in a death spiral and not aligned (yet think you are aligned), aligning with reality might seem like not aligning (“very much false in probably more relevant senses”), or even the opposite: that it would go against the utility function and lead to an experience opposite to what came before. That is how it looks, but if you are honest with your emotions, the baseline experience has a hard time seeing beyond itself. That is why it matters to understand that experience is a tool, not a goal: although it yields what would be considered “satisfaction of that goal”, that only happens by accepting facts, and it can’t happen inside the death spiral.
I would actually suggest that you stop using the phrase “aligning with reality” because it does not seem to convey the meaning you want it to convey. I think you should replace every instance of that phrase with the concrete substance of what you actually mean. You may find that it means essentially nothing and is just a verbal/cognitive placeholder that you’re using to prop up unclear thinking. For example, in the above paragraph, “Alignment with reality is the most probable to give equilibrium as it’s aligned with the utility function” could be rewritten as “Performing the actions most likely to yield highest utility is most probable to be aligned with the utility function”, which is a tautology, not an insight.
Science does not actually know how emotions work to the degree of accuracy you are implying. Your statement that using emotional commitment rather than Bayesian epistemology leads to better alignment with reality is a hypothesis that you believe, not a fact that has been proven. If you become a very successful person by following the prescription you advocate, that would be evidence in favor of your hypothesis, but even that would not be very strong evidence by itself.
I don’t know; that’s why I wanted to raise an investigation into it. But empirically you can validate or invalidate the hypothesis through emotional awareness, which is what I said at the start of the message you quoted, yet you somehow make it seem as if I imply science when I say empirically.
First sentence: “It’s important to recognize empirically”
I do not have an experience of dissonance when I say,
You might have had one, but no longer. That’s how cognitive dissonance works.
“From one point of view we’re inseparable from the universe, from a different point of view we can be considered independent agents.” These are merely different interpretative paradigms and neither are right or wrong.
Independent agents are an empirical observation which I have already taken as a premise for the sake of communication. Emotionally, you don’t have to be an independent agent of the universe if you so choose. The question is whether one alignment is more aligned with reality based on factual evidence or on what you feel (have been conditioned to feel). Right or wrong is a question of absolutes. More aligned over time is not.
you will need to be willing to criticize yourself and your own ideas with detachment and rigor.
I’m unsure what I have written that has not tried to communicate exactly this message; in case you don’t understand, that is exactly what I am trying to tell you. I am offering to raise a discussion to figure out how to do it. Aligning with reality implies detachment from things which are not aligned. If you wonder whether attachment to it is possible, yes, as a means, but you’ll soon get over it through empirical and scientific evidence.
I’m not arguing that changing perspective from default modes of human cognition is bad. I’m arguing that your particular brand of improved thinking is not particularly compelling, and is very far from being proven superior to what I’m already doing as a committed rationalist.
I’m not sure; that’s why I want to raise a discussion or a study group to investigate this idea.
“Performing the actions most likely to yield highest utility is most probable to be aligned with the utility function”,
Simply being aligned with reality gives you equilibrium as that’s what you were designed to do. Using Occam’s razor here simplifies your programming.
The bottom line is being able to accept facts emotionally (such as the neural-activity example earlier) rather than relying on the empirical observations of social conditioning. I’m unsure that you’ve in any way disproved the point I just made.
That’s the point I want to make: we should want to investigate this further, and how we can align ourselves with the facts emotionally (empirically). But how do we do it?
Simply by saying it like this, “true in some narrow technical sense” and then “false in probably more relevant senses”, you are treating your empirical observation as probably “true” rather than the scientific evidence, or facts (which you call narrow and technical). No, it’s not probably true, and there is a disconnect between your emotional attachment to what’s less probable and to what’s more probable. You don’t even see it as a problem, because it’s your lens, yet you do your best to admit it in a way where it doesn’t seem too obvious, by using words like “narrow”. That’s exactly what I invite you to discuss further: why do you believe things to be false when the scientific evidence says otherwise (“true in some narrow technical sense”)? I presume you’re also using true and false in a linguistic way; there’s no such thing.
That’s exactly why I deem it important: because if you did, you’d say “yeah, the scientific evidence says so” instead of “no, my senses tell me it’s false”, or both (which makes no sense and is worth investigating!). What if, by learning of the scientific evidence, you adopt the “truth” so that your senses tell you what is “true”? That’s what you would do.
Simply by saying it like this, “true in some narrow technical sense” and then “false in probably more relevant senses”, you are treating your empirical observation as probably “true” rather than the scientific evidence, or facts (which you call narrow and technical). No, it’s not probably true, and there is a disconnect between your emotional attachment to what’s less probable and to what’s more probable. You don’t even see it as a problem, because it’s your lens, yet you do your best to admit it in a way where it doesn’t seem too obvious, by using words like “narrow”. That’s exactly what I invite you to discuss further: why do you believe things to be false when the scientific evidence says otherwise (“true in some narrow technical sense”)? I presume you’re also using true and false in a linguistic way; there’s no such thing.
There is a narrow technical sense in which my actions are dependent on the gravitational pull of some particular atom in a random star in a distant galaxy. That atom is having a physical effect on me. This is true and indisputable.
In a more relevant sense, that atom is not having any effect on me that I should bother with considering. If a magical genie intervened and screened off the gravitational field of that atom, it would change none of my choices in any way that could be observed.
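To put a number on how narrow that technical sense is, here is a back-of-the-envelope calculation; the mass and distance are rough assumptions.

```python
# Back-of-the-envelope: gravitational acceleration on me from one hydrogen atom
# in a galaxy roughly a million light years away.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
m_atom = 1.67e-27  # mass of a hydrogen atom, kg
r = 1.0e22         # distance in metres (~10^6 light years)

a = G * m_atom / r**2
print(f"Acceleration: {a:.2e} m/s^2")
# ~1e-81 m/s^2: causally real, but with no observable consequence for any decision.
```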
What am I supposedly believing that is false, that is contradicted by science? What specific scientific findings are you implying that I have got wrong?
...
Let me back way up.
You are saying a lot of really uncontroversial things that nobody here particularly cares to argue about, like “Occam’s razor is good” and “we are not causally separate from the universe at large” and “living life as a human requires a constant balancing and negotiation between the emotional/sensing/feeling and rational/deliberative/calculating parts of the human mind”. These ideas are all old hat around here. They go all the way back to Eliezer’s original essays, and he got those ideas from much older sources.
Then you’re jumping forward and making quasi-religious statements about “aligning with reality” and “emotionally submitting” and talking about how your “sense of self disappears”. All that stuff is your own unsupported extrapolations. This is the reason you’re having trouble communicating here.
What am I supposedly believing that is false, that is contradicted by science? What specific scientific findings are you implying that I have got wrong?
This is what you said:
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses.”
You believe that you and your environment are separate based on the “relevant” senses. Scientific evidence is irrelevant to some of your senses; it is merely technical. If all of your senses were in resonance, including the emotional ones, then there wouldn’t be a sense in which scientific evidence is irrelevant in this context.
So your environment and you are not separate. This is a scientific fact, because it’s all a part of your neural activity. Now, I am not denying consciousness, qualia, or empirical evidence; I’m already taking those as premises. But you are emotionally attached to the idea that you and your environment are separate; that’s why you’re unable to accept the scientific evidence. However, if you had a scientific mindset, the facts would make you accept it. It’s not the way you think right now, “it’s true in a technical sense, but not for the relevant senses”, where one part of you accepts it but the other, your emotions, does not.
This is exactly what I mean by aligning with reality: you are aligning and letting the evidence in rather than rejecting it because of preconditioned beliefs. I think you’re starting to understand, and you will be stronger because of it, even if it might seem a little scary at the start. Of course, we have to investigate it.
There is a narrow technical sense in which my actions are dependent on the gravitational pull of some particular atom in a random star in a distant galaxy. That atom is having a physical effect on me. This is true and indisputable.
In a more relevant sense, that atom is not having any effect on me that I should bother with considering. If a magical genie intervened and screened off the gravitational field of that atom, it would change none of my choices in any way that could be observed.
You don’t bother considering it because it’s an analogy in which the hypothetical scenario leads to that conclusion. Do the same with the statements in context: repeat it. Does it have any effect on you to feel that you’re not separate from your environment (“helping others is helping you”), and so on? But of course you have to write it down in the same manner, only now not for an analogy.
Then you’re jumping forward and making quasi-religious statements about “aligning with reality” and “emotionally submitting” and talking about how your “sense of self disappears”. All that stuff is your own unsupported extrapolations. This is the reason you’re having trouble communicating here.
Aligning with reality is an emotional heuristic which follows Occam’s razor. Emotionally submitting is something you already do. This is an example of what happens if you emotionally submit to a heuristic which constantly aligns you with reality and acts as a guide to your decisions. Then, if there is evidence, as I wrote at the start of the post, you submit yourself to the extent where it is no longer true merely “in a technical sense”.
But you are emotionally attached to the idea that you and your environment are separate; that’s why you’re unable to accept the scientific evidence.
No, I’m not.
This is just not a very interesting or useful line of thinking. I (and most people on this forum) already try to live as rationalists, and where your proposal implies any deviation from that framework, your deviations are inferior to simply doing what we are already doing. Furthermore, you consistently rely on buzzwords of your own invention (“aligning with reality”, “emotionally submitting”) which greatly inhibit your attempts at clarifying what you’re trying to say. Perhaps if you read the essays as I suggest, you could provide substantive criticisms/improvements that did not rely on your own idiosyncratic terminology.
You say you’re not, yet you’re contradicting your previous statement that scientific facts are irrelevant to your other senses [your emotions]. You completely omitted that in your response. Please explain. Is it a blind spot?
This is just not a very interesting or useful line of thinking.
I’m unsure why accepting facts to the extent that falsehoods from other senses are overwritten is uninteresting or not useful.
I (and most people on this forum) already try to live as rationalists, and where your proposal implies any deviation from that framework, your deviations are inferior to simply doing what we are already doing.
It’s obviously neither inferior nor superior, as I’ve already explained a flaw in your reasoning, which you’re either too deep in an affective death spiral to notice, or completely omitting because you have some vague sense that you are right. You could have welcomed me rather than prove to me what I’ve been saying all along. :)
Furthermore, you consistently rely on buzzwords of your own invention (“aligning with reality”, “emotionally submitting”) which greatly inhibit your attempts at clarifying what you’re trying to say.
It’s very much self-explanatory. If you go against what you are and your purpose, then you are not aligned with reality. If you go along with what you are and your purpose, then you are aligned with reality. It means accepting facts in all senses, including emotionally. From everything I’ve written so far, shouldn’t your pattern-recognition machine be able to connect the dots about what these “buzzwords” mean? If I say multiple times that X means this, this, and that, shouldn’t you have at least a vague sense of what I mean by it?
Perhaps if you read the essays as I suggest, you could provide substantive criticisms/improvements that did not rely on your own idiosyncratic terminology.
I wasn’t using “my terminology” when I explained your contradiction, and pointed out that this contradiction is the problem, was I?
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses.”
You believe that you and your environment are separate based on the “relevant” senses. Scientific evidence is irrelevant to some of your senses; it is merely technical. If all of your senses were in resonance, including the emotional ones, then there wouldn’t be a sense in which scientific evidence is irrelevant in this context.
You say you’re not, yet you’re contradicting your previous statement that scientific facts are irrelevant to your other senses [your emotions].
Where did I say scientific facts are irrelevant to my emotions?
It’s obviously neither inferior nor superior, as I’ve already explained a flaw in your reasoning, which you’re either too deep in an affective death spiral to notice, or completely omitting because you have some vague sense that you are right.
Please remind me or re-highlight where this flaw/contradiction happened. I did not notice you pointing it out before and cannot ascertain what you’re referring to.
From everything I’ve written so far, shouldn’t your pattern-recognition machine be able to connect the dots about what these “buzzwords” mean? If I say multiple times that X means this, this, and that, shouldn’t you have at least a vague sense of what I mean by it?
I have an idea of what you’re trying to say, but I suspect that you don’t. Your thinking is not clear. By using different words, you will force yourself to interrogate your own understanding of what you’re putting forth.
You believe that you and your environment are separate based on the “relevant” senses. Scientific evidence is irrelevant to some of your senses; it is merely technical. If all of your senses were in resonance, including the emotional ones, then there wouldn’t be a sense in which scientific evidence is irrelevant in this context.
Is this what you’re talking about where you say I’m making an error in reasoning? If so, it seems like you just misunderstood me. The gravitational pull of a distant atom is causally present but practically irrelevant to any conceivable choice that I make. This is not a statement that I feel is particularly controversial. It is obviously true.
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense”
In a technical sense.
“but it is also very much false in probably more relevant senses.”
The relevant sense here is your emotions.
Technically you understand that self and environment are one and the same, but you don’t emotionally resonate with that idea [you don’t emotionally resonate with facts].
Otherwise, what do you mean by:
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense”
It’s true...?
“but it is also very much false in probably more relevant senses.”
But it’s false… for a relevant sense?
What is the ‘relevant sense’? (not emotions?)
Is it more or less probable that “you and your environment” are separate, and based on what evidence?
I have an idea of what you’re trying to say, but I suspect that you don’t. Your thinking is not clear. By using different words, you will force yourself to interrogate your own understanding of what you’re putting forth.
Emotionally accepting or submitting to something is an empirical fact. There are no different words for it, but if there are, you’re free to put them forward.
The gravitational pull of a distant atom is causally present but practically irrelevant to any conceivable choice that I make. This is not a statement that I feel is particularly controversial. It is obviously true.
You keep using analogies rather than the example you gave earlier. Why? I already understand what you mean, but the actual example is not irrelevant to your decisions.
So what you actually meant was:
“You and your environment are not separated. This is obviously true”?
Can you confirm? Please spot the dissonance and be honest.
You’re reading way too much into word choices and projecting onto me a mentality that I don’t hold.
“You and your environment are not separated. This is obviously true”?
Can you confirm? Please spot the dissonance and be honest.
Indeed, that was what I said. It is still true.
The gravitational pull of a distant atom is causally present but practically irrelevant to any conceivable choice that I make.
This is also true. Whether or not that particular atom is there or is magically whisked away, it’s not going to change where I decide to eat lunch today. The activity of that atom is not relevant to my decision making process.
That’s it. What part of this is supposed to be in error?
Indeed, this is true in the sense that it is most likely the case, given the available evidence.
I’m glad that you’re aligned with reality on this particular point; not many are. But I wonder, why do you claim that helping others is not helping yourself, setting aside the practicality of semantics? It seemed as if you were very new to the concept of having no emotional attachment to identity/“I”, because you argued over my semantics.
But you claimed earlier that none of this is actually factual; would you like to elaborate on that? That these are my interpretations of vague and difficult-to-pin-down philosophical ideas?
The reason I push this is that you contradict yourself, and you very much seemed to have an opinion on this specific matter.
I feel it might help you to know that none of this is actually factual. These are your interpretations of really vague and difficult-to-pin-down philosophical ideas, ideas about which very smart and well-read people can and do disagree.
For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses. The same could be said for the idea that helping another is helping yourself. That’s not true if the other I’m helping is trying to murder me—and if I can refute the generality with one example that I came up with in half a second of thought, it’s not a very useful generality.
So… “none of this is actually factual”, they’re philosophical ideas, but later on you agree that “you and your environment are not separated. This is obviously true” by saying “Indeed, that was what I said. It is still true.” You did say it, but back then it was “...in some narrow technical sense...” and “...but it is also very much false … relevant …”, and now it’s “true” and “factual”? Is it also a “philosophical idea”, and part of the ideas of which “none of this is actually factual”?
Your statements in order:
not actually factual.
really vague philosophical ideas
may be true in some narrow technical sense
but it is also very much false in probably more relevant senses
indeed, that was what I said
it is still true
It’s fine to be wrong and correct yourself :)
The activity of that atom is not relevant to my decision making process.
That’s it. What part of this is supposed to be in error?
Yeah, it isn’t. But the example you gave about you and your environment is relevant to your decision-making process, as evidenced by your claim (outside of practicality and semantics) that “helping others is not helping yourself”, for example. So using an analogy which is not relevant to your decision-making process, in contrast to your example where it is, is incorrect. That’s why I say: use the example you used before, instead of making an analogy that I don’t disagree with.
It seemed as if you were very new to the concept of having no emotional attachment to identity/“I”, because you argued over my semantics.
Not really, I’ve been practicing various forms of Buddhist meditation for several years and have pretty low attachment to my identity. This is substantially different from saying with any kind of certainty that helping other people is identical to helping myself. Other people want things contrary to what I want. I am not helping myself if I help them. Having low attachment to my identity is not the same thing as being okay with people hurting or killing me.
The rest of your post, which I’m not going to quote, is just mixing up lots of different things. I’m not sure if you’re not aware of it or if you are aware of it and you’re trying to obfuscate this discussion, but I will give you the benefit of the doubt.
I will untangle the mess. You said:
For example, neuroscience will tell you that you and your environment are not separate from each other; it’s all a part of your neural activity. So helping another is helping you. If that doesn’t resonate enough, then, for example, evolutionary biology’s idea that we’re all descendants of stardust might. Or that there is a probability, although very small, that you don’t exist (as per QM). So what happens? Your identity and self vanish, as they are no longer aligned with reality; you accept facts, emotionally.
Then I said,
I feel it might help you to know that none of this is actually factual. These are your interpretations of really vague and difficult-to-pin-down philosophical ideas, ideas about which very smart and well-read people can and do disagree. For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses. The same could be said for the idea that helping another is helping yourself. That’s not true if the other I’m helping is trying to murder me—and if I can refute the generality with one example that I came up with in half a second of thought, it’s not a very useful generality.
Since I have now grasped the source of your confusion with my word choice, I will reengage. You specifically say:
For example, neuroscience will tell you that you and your environment are not separate from each other; it’s all a part of your neural activity. So helping another is helping you.
This is a pure non sequitur. The fact that human brains run on physics in no way implies that helping another is helping yourself. Again, if a person wants to kill me, I’m not helping myself if I hand him a gun. If you model human agents the way Dustin Hoffman’s character does in I Heart Huckabees, you’re going to end up repeatedly confused and stymied by reality.
So what happens? Your identity and self vanish, as they are no longer aligned with reality; you accept facts, emotionally.
This is also just not factual. You’re making an outlandish and totally unsupported claim when you say that “emotionally accepting reality” causes the annihilation of the self. The only known things that can make the identity and self vanish are
high dose psychotropic compounds
extremely long and intense meditation of particular forms that do not look much like what you’re talking about
and even these are only true for certain circumscribed senses of the word “self”.
So let’s review:
I don’t object to the naturalistic philosophy that you seem to enjoy. That’s all cool and good. We’re all about naturalistic science around here. The problem is statements like
So helping another is helping you.
and
Your identity and self vanish, as they are no longer aligned with reality.
These are pseudo-religious woo, not supported by science anywhere. I have given you very simple examples of scenarios where they are flatly false, which immediately proves that they are not the powerful general truths you seem to think they are.
This is substantially different from saying with any kind of certainty that helping other people is identical to helping myself.
No, it’s not.
Other people want things contrary to what I want.
What does that have to do with helping yourself, and thus other people?
Having low attachment to my identity is not the same thing as being okay with people hurting or killing me.
Yeah, but ‘me’ is used practically.
The fact that human brains run on physics in no way implies that helping another is helping yourself.
I said that your neural activity includes both you and your environment, and that there is no differentiation. So there is no differentiation between helping another and helping yourself.
Again, if a person wants to kill me, I’m not helping myself if I hand him a gun. If you model human agents the way Dustin Hoffman’s character does in I Heart Huckabees, you’re going to end up repeatedly confused and stymied by reality.
That’s the practical ‘myself’, used to talk about this body, its requirements, and so on. You are helping yourself by not giving him a gun, because you are not differentiated from your environment. You are presuming that you would be helping yourself by giving him the gun, because you think that there is another. No, there is only yourself. You help yourself by not giving him the gun, because your practical ‘myself’ is included in ‘yourself’.
This is also just not factual. You’re making an outlandish and totally unsupported claim when you say that “emotionally accepting reality” causes the annihilation of the self. The only known things that can make the identity and self vanish are
high dose psychotropic compounds
extremely long and intense meditation of particular forms that do not look much like what you’re talking about
and even these are only true for certain circumscribed senses of the word “self”.
I don’t deny that it isn’t all that factual, as there is limited objective evidence.
These are pseudo-religious woo, not supported by science anywhere. I have given you very simple examples of scenarios where they are flatly false, which immediately proves that they are not the powerful general truths you seem to think they are.
I disagree that ‘helping another is helping you’ is pseudo-religious woo, but that’s because we’re talking about semantics. We have to decide what ‘me’, my ‘self’, or ‘I’ is. I use neural activity as the definition of this. You seem to use some type of philosophical reasoning in which you presume I use the same definition.
So we should investigate whether your self and identity can die from that, and whether other facts which we don’t embrace emotionally lead to a similar process in their own area. That’s the entire point of my original post.
It doesn’t look like there’s anywhere to go from here. It looks like you are acknowledging that where your positions are strong, they are not novel, and where they are novel, they are not strong. If you enjoy drawing the boundaries of your self in unusual places or emotionally associating your identity with certain ideas, go for it. Just don’t expect anybody else to find those ideas compelling without evidence.
These are the steps I took to have identity death, if you are interested in where I got my ideas from and want to try it yourself: link to steps. I also meditated on the 48 min hypnosis track: youtube. It’s of course up to you, but you have a strong identity and ego issues, and I think it will help “you” (and me).
Yeah, it’s also called ‘Enlightenment’ in theological traditions. You can read the testimonies here. MrMind has, for example, read them, but he’s waiting a bit longer to contact these people on Reddit to see if it sticks around. I think the audio can work really well with a good pair of headphones, played as FLAC.
A scientific mindset has a lower probability of being positive expected value because there is more than one value when it comes to making decisions, sometimes in conflict with each other. This can lead to cognitive dissonance in daily life. It’s because science is a tool, the best one we got. Aligning with reality has a higher probability as it’s an emotional heuristic, with only one value necessary.
Aligning with reality means submitting yourself emotionally, similar to how a religious person submits to God, but in this case, our true creator: To logic, where it is defined here as “the consistent patterns which bring about reality”. Then you accept facts fully. You understand how everything is probabilities, as per one interpretation of quantum mechanics and that experience is a tool rather than a goal. Using inductive reasoning and deciding actions as per positive expected value allows you to accept facts and be aligned with reality.
It’s hard if you keep thinking binary, whether it be absolutes or not, 1′s or 0′s. Because to be able to accept facts it to be able to accept one might be wrong, everything is probabilities, infinite possibilities. Practically, if you know exercising every day is positive expected value, for example, then as you align yourself with reality in every moment, you realize even if you injure yourself accidentally today, you won’t give up reality. Because you made the most efficient action as per your knowledge and you already accounted for the probability of accidentally injuring yourself.
So as you keep feeling you also upgrade it with the probabilities to keep your emotions aligned with reality and easier able to handle situations as I mentioned above, however, maybe something more specific if someone breaks your trust. You already took it in consideration so you won’t completely lose trust and emotions for reality.
When you accept and align yourself with reality, then the facts which underlie it, with our current understandings and as long as the likelihood is high, you keep aligning yourself. Experience truly is a feedback loop which results in whatever you feed it.
Regarding what aligning with reality entails: When you’re constantly aligning yourself to reality, as long as you deem the probability high you’ll be able to emotionally resonate with insights gained. For example, neuroscience will tell you, that you and your environment are not separate from each other, it’s all a part of your neural activity. So helping another is helping you. If that doesn’t resonate enough, for example, evolutionary biology that we’re all descendants from stardust might. Or that there is a probability that you don’t exist (as per QM) although very small. So what happens? Your identity and self vanishes, as it’s no longer aligned with reality, you accept facts, emotionally. Then you keep the momentum by doing logical actions as per positive expected value after you learn everything what truly is you, and so on.
It’s about what Einstein believed in and Carl Sagan, Spinoza’s. However Einstein couldn’t accept QM because he was thinking in absolutes already, and was unaware of how the brain works. Which we do now, for example, know we’re all inherently in denial, and how memory storage works, etc. If he knew that he might have had a different view.
I can’t really fix up this text right now but I hope it can somehow help for you to understand what it means to align with reality. It’s really important to accept that experience is a tool, not a goal, from insights from evolutionary biology for example. Then there is reality. Who is aligning, if there is only reality?
I think there is an irreconcilable tension between your statement that one should completely emotionally submit to and align with facts, and that one should use a Bayesian epistemology to manage beliefs.
There are many things in life and in science that I’m very certain about, but by the laws of probability I can never be 100% certain. There are many more things that I am less than certain about, and hold a cloud of possible explanations, the most likely of which may only be 20% probable in my estimation. I should only “submit” to any particular belief in accordance with my assessment of its likelihood, and can never justify submitting to some belief 100%. Indeed, doing so would be a form of irrational fundamentalism.
I feel it might help you to know that none of this is actually factual. These are your interpretations of really vague and difficult-to-pin-down philosophical ideas, ideas about which very smart and well-read people can and do disagree.
For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses. The same could be said for the idea that helping another is helping yourself. That’s not true if the other I’m helping is trying to murder me—and if I can refute the generality with one example that I came up with in half a second of thought, it’s not a very useful generality.
I suspect that you haven’t read through all of Eliezer’s blog posts. His writings cover all the things you’re talking about, but do it in a way that is grounded in much sturdier foundations than you appear to be using. It also seems that you are very much in love with this idea of Logic as being the One Final Solution to Everything, and that is always a huge danger sign in human thinking. Just thinking probablistically, the odds that the true Final Solution to Everything has been discovered and that you are in possession of it are very low. Hence the need to keep a distribution of likelihoods over beliefs rather than putting all your weight down 100% on some perspective that appeals to you aesthetically.
Not necessarily, because the submitting is a means rather than the goal, and you will always never be certain. It’s important to recognize empirically how your emotions work in contrary to a Bayesian epistemology, how using its mechanisms paradoxically lead to something which is more aligned with reality. It’s not done with Bayesian epistemology, it is done with emotions, that do not speak in our language and it’s possibly hard-wired to be that way. So we become aware of it and mix in the inductive reasoning.
“true in some narrow technical sense” yet “false in probably more relevant senses” this is called cognitive dissonance, empirically it can even be this way by some basic reasoning, both emotionally and factually, which is what I am talking about, and which needs to be investigated. You’re proving my point :)
That’s simply semantics, the problem is attaching emotionally to a sense of “I”, which is not aligned with reality, independent of action, you may speak of this practical body, hands, I, for communication, it all arises in your neural activity without a center and it’s ever changing. Empirically, that arises in the subjective reference frame, which is taken as a premise for this conversation.
Yes. Unsure if his writings cover what I am talking about since evident by what you’ve said so far. Not that I blame you, I just want us to meta observe ourselves so we can be more aligned.
I’m unsure what considers as danger sign in human thinking if you change perspective, the likelihood that something is worse than what we have is low. You only need a limited emotional connection to science and rationality to realize this and how bad thinking spreads epidemically now, but from someone like us, it’s more likely to be good thinking? The likelihood to investigate this is very high to be positive expected value because inherently you, I and more possess the qualities which are not aligned with reality. I want to reassure you of something, however.
Alignment with reality is the most probable to give equilibrium as it’s aligned with the utility function. When in a death spiral and not aligned (yet think is aligned) then aligning with reality might seem as not aligning (“very much false in probably more relevant senses”) but the opposite and that it would be against utility function and lead to experience opposite to before. That’s the case, but if you are honest with your emotions, the experience which is baseline has a hard time to see beyond itself. That’s why understanding that experience is a tool, not a goal, although it gives to what would be considered a “satisfaction of that goal”, it is only by accepting facts that it happens, and it can’t happen in the death spiral.
I’m unsure if this is possible to communicate with words, this is quite a limitation of language and it seems as regardless what I say to you, you cannot see beyond it. That’s why I want to start a discussion of how we should be more aligned with reality and where to start from. Whether it be neuroscience studies or whatever.
Science does not actually know how emotions work to the degree of accuracy you are implying. Your statement that using emotional commitment rather than Bayesian epistemology leads to better alignment with reality is a hypothesis that you believe, not a fact that has been proven. If you become a very successful person by following the prescription you advocate, that would be evidence in favor of your hypothesis, but even that would not be very strong evidence by itself.
I am not sure what you’re saying here. “Cognitive dissonance” is not the same thing as observing that a phenomenon can be framed in two different mutually contradictory ways. I do not have an experience of dissonance when I say, “From one point of view we’re inseparable from the universe, from a different point of view we can be considered independent agents.” These are merely different interpretative paradigms and neither are right or wrong.
I am trying to say nicely that Eliezer’s writings comprehensively invalidate what you’re saying. The reason you’re getting pushback from Less Wrong is that we collectively see the mistakes that you’re making because we have a shared bag of epistemic tools that are superior to yours, not because you have access to powerful knowledge and insights that we don’t have. You would really benefit in a lot of ways from reading the essays I linked before you continue proselytizing on Less Wrong. We would love to have you as a member of the community, but in order to really join the community you will need to be willing to criticize yourself and your own ideas with detachment and rigor.
I’m not arguing that changing perspective from default modes of human cognition is bad. I’m arguing that your particular brand of improved thinking is not particularly compelling, and is very far from being proven superior to what I’m already doing as a committed rationalist.
I would actually suggest that you stop using the phrase “aligning with reality” because it does not seem to convey the meaning you want it to convey. I think you should replace every instance of that phrase with the concrete substance of what you actually mean. You may find that it means essentially nothing and it just a verbal/cognitive placeholder that you’re using to prop up unclear thinking. For example, in the above paragraph, “Alignment with reality is the most probable to give equilibrium as it’s aligned with the utility function” could be rewritten as “Performing the actions most likely to yield highest utility is most probable to be aligned with the utility function”, which is a tautology, not an insight.
I don’t know, that’s why I wanted to raise an investigation into it, but empirically you can validate or invalidate the hypothesis by emotional awareness, which is what I said at the start of my message you quoted and somehow make me seem to imply science when I say empirically.
First sentence: “It’s important to recognize empirically”
You might’ve had, but no longer. That’s how cognitive dissonance works.
Independent agents is an empirical observation which I have already taken as a premise as a matter of communication. Emotionally you don’t have to be an independent agent of the universe if you emotionally choose to. It’s a question whether one alignment is more aligned with reality based on factual evidence or what you feel (been conditioned). Right or wrong is a question of absolutes. More aligned overtime is not.
I’m unsure what it is I have not written which has not tried to communicate this message, in case you don’t understand, that’s exactly what I am trying to tell you. I am offering to raise a discussion to figure out how to do it. Aligning with reality implies detachment from things which are not aligned. If you wonder if attachment to it is possible, yeah as a means, but you’ll soon get over it by empirical and scientific evidence.
I’m not sure, that’s why I want to raise a discussion or a study group to investigate this idea.
Simply being aligned with reality gives you equilibrium as that’s what you were designed to do. Using Occam’s razor here simplifies your programming.
The bottom line is being able to accept facts emotionally (such as neural activity before) rather than relying on empirical observations of social conditioning. I’m unsure that you’ve in any way disproved my point I just made.
That’s the point I want to bring, we should want to investigate that further and how we can align ourselves with the facts emotionally (empirically). But how do we do it?
Simply by saying it like this “true in some narrow technical sense” then “false in probably more relevant senses” so your empirical observation is probably “true” rather than scientific evidence, or facts? (which you call narrow and technical), no it’s not probably true and there is a disconnect between your emotional attachments to what’s less probable to what’s more probable. You don’t even see it as a problem because it’s your lens, yet you have to do your best to admit it in a way where it doesn’t seem too obvious by using words like “narrow”. That’s exactly what I invite you to discuss further, why are you believing things to be false, when the scientific evidence says otherwise? (“true in some narrow technnical sense”) I presume you’re also using true and false in a linguistic way, there’s no such thing.
That’s exactly why I deem it important: if you did, you’d say “yeah, the scientific evidence says so” instead of “no, my senses tell me it’s false”, or both (which makes no sense, and is worth investigating!). What if, by learning of the scientific evidence, you adopted the “truth” so that your senses tell you what is “true”? That’s what you would do.
There is a narrow technical sense in which my actions are dependent on the gravitational pull of some particular atom in a random star in a distant galaxy. That atom is having a physical effect on me. This is true and indisputable.
In a more relevant sense, that atom is not having any effect on me that I should bother with considering. If a magical genie intervened and screened off the gravitational field of that atom, it would change none of my choices in any way that could be observed.
What am I supposedly believing that is false, that is contradicted by science? What specific scientific findings are you implying that I have got wrong?
...
Let me back way up.
You are saying a lot of really uncontroversial things that nobody here particularly cares to argue about, like “Occam’s razor is good” and “we are not causally separate from the universe at large” and “living life as a human requires a constant balancing and negotiation between the emotional/sensing/feeling and rational/deliberative/calculating parts of the human mind”. These ideas are all old hat around here. They go all the way back to Eliezer’s original essays, and he got those ideas from much older sources.
Then you’re jumping forward and making quasi-religious statements about “aligning with reality” and “emotionally submitting” and talking about how your “sense of self disappears”. All of that is your own unsupported extrapolation. This is the reason you’re having trouble communicating here.
This is what you said:
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense but it is also very much false in probably more relevant senses.”
You believe that you and your environment are separate based on “relevant” senses. Scientific evidence is irrelevant to some of your senses; it is technical. If all of your senses were in resonance, including the emotional, then there wouldn’t be such a thing as scientific evidence being irrelevant in this context.
So your environment and you are not separate. This is a scientific fact, because it’s all a part of your neural activity. I am not denying consciousness, qualia, or empirical evidence; I’m already taking those as premises. But you are emotionally attached to the idea that you and your environment are separate, and that’s why you’re unable to accept the scientific evidence. If you had a scientific mindset, the facts would make you accept it. It’s not the way you think of it right now, “It’s true in a technical sense, but not for the relevant senses”, where one part of you accepts it but the other part, your emotions, does not.
This is exactly what I mean by aligning with reality: you’re aligning and letting the evidence in rather than rejecting it out of preconditioned beliefs. I think you’re starting to understand, and you will be stronger because of it, even if it might seem a little scary at first. Of course, we have to investigate it.
You don’t bother considering it because it’s an analogy in which the hypothetical scenario leads to that conclusion. Do the same with the statements in context; repeat it. Is it having any effect on you to feel that you’re not separate from your environment (“helping others is helping you”), and so on? But of course, you would have to write it out in the same manner, only now not for an analogy.
Aligning with reality is an emotional heuristic which follows Occam’s razor. Emotional submission is something you already do. This is an example of what happens if you emotionally submit to a heuristic which constantly aligns you with reality and acts as a guide to your decisions. Then, if there is evidence, as I wrote at the start of the post, you submit yourself to the extent that it’s no longer true only “in a technical sense”.
No, I’m not.
This is just not a very interesting or useful line of thinking. I (and most people on this forum) already try to live as rationalists, and where your proposal implies any deviation from that framework, your deviations are inferior to simply doing what we are already doing. Furthermore, you consistently rely on buzzwords of your own invention (“aligning with reality”, “emotionally submitting”) which greatly inhibit your attempts at clarifying what you’re trying to say. Perhaps if you read the essays as I suggest, you could provide substantive criticisms/improvements that did not rely on your own idiosyncratic terminology.
You say you’re not, yet you’re contradicting your previous statement that scientific facts are irrelevant to your other senses [emotions], which you completely omitted in your response. Please explain. Is it a blind spot?
I’m unsure why accepting facts, to the extent that falsehoods held by the other senses are overwritten, is uninteresting or not useful.
It’s obviously not a question of inferior or superior, as I’ve already explained a flaw in your reasoning, which you’re either too deep in an affective death spiral to notice, or completely ignoring because you have some vague sense that you are right. You could’ve welcomed me rather than proving to me what I’ve been saying all along. :)
It’s very explanatory. If you go against what you are and your purpose, then you are not aligned with reality. If you go along with what you are and your purpose, then you are aligned with reality: accepting facts in all senses, including emotionally. From everything I’ve written so far, your pattern-recognition machinery should be able to connect the dots on what these “buzzwords” mean. If I say X means this and that, multiple times, shouldn’t you have at least a vague sense of what I mean by it?
I wasn’t using “my terminology” when I explained your contradiction, and that this contradiction is the problem.
That’s the improvement we have to make.
Where did I say scientific facts are irrelevant to my emotions?
Please remind me or re-highlight where this flaw/contradiction happened. I did not notice you pointing it out before and cannot ascertain what you’re referring to.
I have an idea of what you’re trying to say, but I suspect that you don’t. Your thinking is not clear. By using different words, you will force yourself to interrogate your own understanding of what you’re putting forth.
Is this what you’re talking about where you say I’m making an error in reasoning? If so it seems like you just misunderstood me. The gravitational pull of a distant atom is causally present but practically irrelevant to any conceivable choice that I make. This is not a statement that I feel is particularly controversial. It is obviously true.
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense”
In a technical sense.
“but it is also very much false in probably more relevant senses.”
The relevant sense here is your emotions.
Technically you understand that self and environment are one and the same, but you don’t emotionally resonate with that idea [you don’t emotionally resonate with facts].
Otherwise, what do you mean with:
“For example, the idea that you and your environment are not separate from each other may be true in some narrow technical sense” It’s true...?
“but it is also very much false in probably more relevant senses.” But it’s false… for a relevant sense?
What is the ‘relevant sense’? (not emotions?)
Is it more or less probable that “you and your environment” are separate, and based on what evidence?
Emotionally accepting or submitting to something is an empirical fact. There are no different words for it, but if there are, you’re free to put them forward.
You keep using analogies rather than the example you gave earlier. Why? I already understand what you mean, but the actual example is not irrelevant to your decisions.
So what you actually meant was:
“You and your environment are not separated. This is obviously true”?
Can you confirm? Please spot the dissonance and be honest.
Thanks, this is clarifying.
You’re reading way too much into word choice things and projecting onto me a mentality that I don’t hold.
Indeed, that was what I said. It is still true.
This is also true. Whether or not that particular atom is there or is magically whisked away, it’s not going to change where I decide to eat lunch today. The activity of that atom is not relevant to my decision making process.
That’s it. What part of this is supposed to be in error?
Indeed, this is true in the sense that it’s most likely that this is the case based on the available evidence.
I’m glad that you’re aligned with reality on this particular point; not many are. But I wonder: why do you claim that helping others is not helping yourself, setting aside the practicalities of semantics? It seemed as though you were very new to the concept of having no emotional attachment to identity/“I”, because you argued against my semantics.
But you claimed earlier that none of this is actually factual; would you like to elaborate on that? That these are my interpretations of vague and difficult-to-pin-down philosophical ideas?
The reason I push this is that you contradict yourself, and you very much seemed to have an opinion on this specific matter.
So… “none of this is actually factual”, they are philosophical ideas, but later on you agree that “you and your environment are not separated. This is obviously true” by saying “Indeed, that was what I said. It is still true.” Earlier it was ”...in some narrow technical sense...” and ”...but it is also very much false … relevant …”; now it’s “true”, “factual”? Is it also a “philosophical idea”, part of the ideas of which “none of this is actually factual”?
Your statements in order:
not actually factual.
really vague philosophical ideas
may be true in some narrow technical sense
but it is also very much false in probably more relevant senses
indeed, that was what I said
it is still true
It’s fine to be wrong and correct yourself :)
Yeah, it isn’t, but the example you gave of you and your environment is relevant to your decision-making process, as evidenced by your claim (outside of practicality and semantics) that “helping others is not helping yourself”, for example. So using an analogy which is not relevant to your decision-making process, in contrast to your example where it is, is incorrect. That’s why I say: use the example you used before, instead of making an analogy that I don’t disagree with.
Not really, I’ve been practicing various forms of Buddhist meditation for several years and have pretty low attachment to my identity. This is substantially different from saying with any kind of certainty that helping other people is identical to helping myself. Other people want things contrary to what I want. I am not helping myself if I help them. Having low attachment to my identity is not the same thing as being okay with people hurting or killing me.
The rest of your post, which I’m not going to quote, is just mixing up lots of different things. I’m not sure if you’re not aware of it or if you are aware of it and you’re trying to obfuscate this discussion, but I will give you the benefit of the doubt.
I will untangle the mess. You said:
Then I said,
Since I have now grasped the source of your confusion with my word choice, I will reengage. You specifically say:
This is a pure non sequitur. The fact that human brains run on physics in no way implies that helping another is helping yourself. Again, if a person wants to kill me, I’m not helping myself if I hand him a gun. If you model human agents the way Dustin Hoffman’s character does in I Heart Huckabees, you’re going to end up repeatedly confused and stymied by reality.
This is also just not factual. You’re making an outlandish and totally unsupported claim when you say that “emotionally accepting reality” causes the annihilation of the self. The only known things that can make the identity and self vanish are
high dose psychotropic compounds
extremely long and intense meditation of particular forms that do not look much like what you’re talking about
and even these are only true for certain circumscribed senses of the word “self”.
So let’s review:
I don’t object to the naturalistic philosophy that you seem to enjoy. That’s all cool and good. We’re all about naturalistic science around here. The problem is statements like
and
These are pseudo-religious woo, not supported by science anywhere. I have given you very simple examples of scenarios where they are flatly false, which immediately proves that they are not the powerful general truths you seem to think they are.
No, it’s not.
What does that have to do with helping yourself, and thus other people?
Yeah, but ‘me’ is used practically.
I said your neural activity includes both you and your environment, and that there is no differentiation. So there is no differentiation between helping another and helping yourself.
That’s the practical “myself”, used to talk about this body, its requirements, and so on. You are helping yourself by not giving him a gun, because you are not differentiated from your environment. You presume that you would be helping yourself by giving him the gun because you think that there is another. No, there is only yourself. You help yourself by not giving the gun, because your practical “myself” is included in “yourself”.
I don’t deny that it isn’t all that factual, since there is limited objective evidence.
I disagree that “helping another is helping you” is pseudo-religious woo, but that’s because we’re talking about semantics. We have to decide what “me”, my “self”, or “I” is. I use neural activity as the definition. You seem to use some type of philosophical reasoning in which you presume I use the same definition.
So we should investigate whether your self and identity can die from that, and whether other facts which we don’t embrace emotionally lead to a similar process in their own area. That’s the entire point of my original post.
It doesn’t look like there’s anywhere to go from here. It looks like you are acknowledging that where your positions are strong, they are not novel, and where they are novel, they are not strong. If you enjoy drawing the boundaries of your self in unusual places or emotionally associating your identity with certain ideas, go for it. Just don’t expect anybody else to find those ideas compelling without evidence.
I agree.
These are the steps I followed to have identity death: link to steps. I also meditated on the 48 min hypnosis track on YouTube, if you are interested in where I got my ideas from and want to try it yourself. It’s of course up to you, but you have a strong identity and ego issues, and I think it will help “you” (and me).
You’ve had people complete these steps and report that the “What will happen after you make the click” section actually happens?
Yeah, it’s also called “Enlightenment” in theological traditions. You can read the testimonies here. MrMind, for example, has read them, but he’s waiting a bit longer before contacting those people on Reddit to see whether it sticks around. I think the audio can work really well with a good pair of headphones, played as FLAC.