it’s like arguing that fairies are coming out of my toaster in the middle of the night. You can’t prove to me that there aren’t fairies in my toaster, but that doesn’t mean you should take me seriously. What I have a problem with is not so much religion or god, but faith. When you say you believe something in your heart and therefore you can act on it, you have completely justified the 9/11 bombers. You have justified Charlie Manson. If it’s true for you, why isn’t it true for them? Why are you different? If you say “I believe there’s an all-powerful force of love in the universe that connects us all, and I have no evidence of that but I believe it in my heart,” then it’s perfectly okay to believe in your heart that Sharon Tate deserves to die. It’s perfectly okay to believe in your heart that you need to fly planes into buildings for Allah.
-- Penn Jillette in “Penn Jillette Is Willing to Be a Guest on Adolf Hitler’s Talk Show”, Vanity Fair, June 17, 2010
This quote seems like it’s lumping every process for arriving at beliefs besides reason into one: “If you don’t follow the process that I understand and that is guaranteed not to produce beliefs like that, then I can’t guarantee you won’t produce beliefs like that!” But there are many such processes besides reason that could be going on in people’s “hearts” to produce their beliefs. The fact that they are all opaque, non-negotiable, and not this particular one you trust not to make people murder Sharon Tate does not mean that they all have the same probability of producing plane-flying-into-building beliefs.
Consider the following made-up quote: “When you say you believe something is acceptable for some reason other than the Bible said so, you have completely justified Stalin’s planned famines. You have justified Pol Pot. If it’s acceptable for you, why isn’t it acceptable for them? Why are you different? If you say ‘I believe that gays should not be stoned to death and the Bible doesn’t support me but I believe it in my heart’, then it’s perfectly okay to believe in your heart that dissidents should be sent to be worked to death in Siberia. It’s perfectly okay to believe because your secular morality says so that all the intellectuals in your country need to be killed.”
I would respond to it: “Stop lumping all moralities into two classes, your morality, and all others. One of these lumps has lots of variation in it, and sub-lumps which need to be distinguished, because most of them do not actually condone gulags.”
And likewise I respond to Penn Jillette’s quote: “Stop lumping all epistemologies into two classes, yours, and the one where people draw beliefs from their ‘hearts’. One of these lumps has lots of variation in it, and sub-lumps which need to be distinguished, because most of them do not actually result in beliefs that drive them to fly planes into buildings.”
The wishful-thinking new-age “all powerful force of love” faith epistemology is actually pretty safe in terms of not driving people to violence who wouldn’t already be inclined to it; a belief that they ought to hurt someone wouldn’t make them feel good, so wishful thinking won’t generate it. Though of course, faith plus ancient texts which condone violence can be more dangerous, although, as we know empirically, people driven to violence by their religions are for some reason rare these days, even coming from religions like that.
I don’t think it’s lumping everything together. It’s criticizing the rule “Act on what you feel in your heart.” That applies to a lot of people’s beliefs, but it certainly isn’t the epistemology of everyone who doesn’t agree with Penn Jillette.
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible. But if my epistemology is an appeal to an external source (which I guess in this context would be a religious book but I’m going to use “believe whatever Rameses II believed” because I think that’s funnier), then that doesn’t necessarily have the same problem.
You can criticize my choice of Rameses II, and you probably should. But now my epistemology is based on an external source and not just my feelings. Unless you reduce me to saying I trust Rameses because I Just Feel that he’s trustworthy, this epistemology does not have the same problem as the one criticized in the quote.
All this to say, Jillette is not unfairly lumping things together and there exist types of morality/epistemology that can be wrong without having this argument apply.
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible. But if my epistemology is an appeal to an external source (which I guess in this context would be a religious book but I’m going to use “believe whatever Rameses II believed” because I think that’s funnier), then that doesn’t necessarily have the same problem.
‘Act on an external standard’ is just as generalizable—because you can choose just about anything as your standard. You might choose to consistently act like Gandhi, or like Hitler, or like Zeus, or like a certain book suggests, or like my cat Peter who enjoys killing things and scratching cardboard boxes. If the only thing I know about you is that you consistently behave like someone else, but I don’t know like whom, then I can’t actually predict your behavior at all.
The more important question is: if you act on what you feel in your heart, what determines or changes what is in your heart? And if you act on an external standard, what makes you choose or change your standard?
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible.
From the outside it looks like there’s all this undefined behavior, with demons coming out of the nose, because you aren’t looking at the exact details of what’s going on with the feelings that are choosing the beliefs. Though a C compiler given an undefined construct may cause your program to crash, it will never literally cause demons to come out of your nose, and you could figure this out if you looked at the implementation of the compiler. It’s still deterministic.
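To make the compiler half of the analogy concrete, here is a tiny illustration of my own (not from the comment), assuming a typical two’s-complement implementation such as gcc on x86-64 with optimizations disabled: signed overflow is undefined behavior as far as the standard is concerned, but any particular implementation still does something specific and predictable.

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    int x = INT_MAX;
    /* Undefined behavior per the C standard: the standard says nothing
     * about what happens here. A concrete implementation still does
     * something concrete; e.g. gcc -O0 on x86-64 just wraps to INT_MIN.
     * No demons, nasal or otherwise. */
    x = x + 1;
    printf("%d\n", x);
    return 0;
}
```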
As an atheistic meta-ethical anti-realist, my utility function is basically whatever I want it to be. It’s entirely internal. From the outside, from someone who has a system where they follow something external and clearly specified, they could shout “Nasal demons!”, but demons will never come out my nose, and my internal, ever so frighteningly non-negotiable desires are never going to include planned famines. It has reliable internal structure.
The mistake is looking at a particular kind of specification that defines all the behavior, and then looking at a system not covered by that specification, but which is controlled by another specification you haven’t bothered to understand, and saying “Who can possibly say what that system will do?”
Some processors (even x86) have instructions (such as bit rotate) which are useful for significant performance boosts in stuff like cryptography, and yet aren’t accessible from C or C++, and to use them you have to perform hacks like writing the machine code out as bytes, casting its address to a function pointer and calling it. That’s undefined behavior with respect to the C/C++ standard. But it’s perfectly predictable if you know what platform you’re on.
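For what it’s worth, here is a minimal sketch of the kind of hack being described, assuming Linux on x86-64 and the System V calling convention (the byte values and the mmap-based setup are my own illustration, not something from the comment). It is undefined behavior by the letter of the standard, yet completely predictable on that platform.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

/* x86-64 machine code for a 32-bit rotate-left:
   mov eax, edi ; mov ecx, esi ; rol eax, cl ; ret */
static const unsigned char rotl_code[] = {0x89, 0xF8, 0x89, 0xF1, 0xD3, 0xC0, 0xC3};

typedef uint32_t (*rotl_fn)(uint32_t value, uint32_t count);

int main(void) {
    /* The code has to live in executable memory, so copy it into an
       anonymous mapping marked PROT_EXEC. */
    void *buf = mmap(NULL, sizeof rotl_code, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED)
        return 1;
    memcpy(buf, rotl_code, sizeof rotl_code);

    rotl_fn rotl = (rotl_fn)buf;             /* the cast the comment describes */
    printf("%08x\n", rotl(0x80000001u, 1));  /* prints 00000003 */
    return 0;
}
```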
The utility functions of other people, who aren’t meta-ethical anti-realists, are not really negotiable either. You can’t really give them a valid argument that will convince them not to do something evil if they happen to be psychopaths. They just have internal desires and things they care about, and they care a lot more than I do about having a morality which sounds logical when argued for.
And if you actually examine what’s going on with the feelings of people with feeling-driven epistemology that makes them believe things, instead of just shouting “Nasal demons! Unspecified behavior! Infinitely beyond the reach of understanding!” you will see that the non-psychopathic ones have mostly-deterministic internal structure to their feelings that prevents them from believing that they should murder Sharon Tate. And psychopaths won’t be made ethical by reasoning with them anyway. I don’t believe the 9/11 hijackers were psychopaths, but that’s the holy book problem I mentioned, and a rare case.
In most cases of undefined C constructs, there isn’t another carefully-tuned structure that’s doing the job of the C standard in making the behavior something you want, so you crash. And faith-epistemology does behave like this (crashing, rather than running hacky cryptographic code that uses the rotate instructions) when it comes to generating beliefs that don’t have obvious consequences to the user. So it would have been a fair criticism to say “You believe something because you believe it in your heart, and you’ve justified not signing your children up for cryonics because you believe in an afterlife,” because (A) they actually do that, (B) it’s a result of them having an epistemology which doesn’t track the truth.
Disclaimer: I’m not signed up for cryonics, though if I had kids, they would be.
my utility function is basically whatever I want it to be.
I very much doubt that. At least with present technology you cannot self-modify to prefer dead babies over live ones; and there’s presumably no technological advance that can make you want to.
my utility function is basically whatever I want it to be.
If utility functions are those constructed by the VNM theorem, your utility function is your wants; it is not something you can have wants about. There is nothing in the machinery of the theorem that allows for a utility function to talk about itself, to have wants about wants. Utility functions and the lotteries that they evaluate belong to different worlds.
Are there theorems about the existence and construction of self-inspecting utility functions?
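For reference, a compact statement of the VNM representation the parent comment is appealing to (notation mine, and only a sketch): if a preference relation ≿ over lotteries satisfies completeness, transitivity, continuity and independence, then there is a utility function u on outcomes, unique up to positive affine transformation, with

\[
p \succeq q \iff \sum_{x} p(x)\,u(x) \ \ge\ \sum_{x} q(x)\,u(x).
\]

Here u takes outcomes x as arguments while ≿ compares lotteries p and q; nothing in the construction lets u, or the preferences it represents, take the utility function itself as an argument, which is the sense in which the utility function and the lotteries it evaluates “belong to different worlds”.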
Though of course, faith plus ancient texts which condone violence can be more dangerous
That means you can actually make people less harmful if you tell them to listen to their hearts instead of listening to ancient texts. The person who’s completely in their head and analyses the ancient text for absolute guidance of action is dangerous.
A lot of religions also have tricks where the believer has to go through painful exercises. Just look at a Christian sect like Opus Dei with cilices. The kind of religious believer who wears a cilice loses touch with his heart.
Getting someone who’s in the habit of causing his own body pain with a cilice to harm other people is easier.
I’d have to disagree here; I think that “faith” is a useful reference class that pretty effectively cleaves reality at the joints, which does in fact lump together the epistemologies Penn Jillette is objecting to.
The fact that some communities of people who have norms which promote taking beliefs on faith do not tend to engage in acts of violence, while some such communities do, does not mean that their epistemologies are particularly distinct. Their specific beliefs might be different, but one group will not have much basis to criticize the grounds of others’ beliefs.
The flaw he’s arguing here is not “faith-based reasoning sometimes drives people to commit acts of violence,” but “faith-based reasoning is unreliable enough that it can justify anything, in practice as well as principle, including acts of extreme violence.”
I’d have to disagree here; I think that “faith” is a useful reference class that pretty effectively cleaves reality at the joints, which does in fact lump together the epistemologies Penn Jillette is objecting to.
People who follow the moral code of the Bible versus people who don’t is also a pretty clear criterion that separates some epistemologies from others.
The fact that some communities of people who have norms which promote taking beliefs on faith do not tend to engage in acts of violence, while some such communities do, does not mean that their epistemologies are particularly distinct.
People who use a pendulum to make decisions have a very different epistemology from someone who thinks about what the authorities in his particular church want him to do and acts accordingly.
“faith-based reasoning is unreliable enough that it can justify anything, in practice as well as principle, including acts of extreme violence.”
The kind of people who win the world debating championship also have no problem justifying policies like genocide with rational arguments that win competitive intellectual debates.
Justifying actions is something different from decision criteria.
People who follow the moral code of the Bible versus people who don’t is also a pretty clear criterion that separates some epistemologies from others.
Yes, but then you can go a step down from there, and ask “why do you believe in the contents of the bible?” For some individuals, this will actually be a question of evidence; they are prepared to reason about the evidence for and against the truth of the biblical narrative, and reject it given an adequate balance of evidence. They’re generally more biased on the question than they realize, but they are at least convinced that they must have adequate evidence to justify their belief in the biblical narrative.
I have argued people out of their religious belief before (and not just Christianity), but never someone who thought that it was correct to take factual beliefs that feel right “on faith” without first convincing them that this is incorrect as a general rule, not simply in the specific case of religion. This is an epistemic underpinning which unites people from different religions, whatever tenets or holy books they might subscribe to. I’ve also argued the same point with people who were not religious; it’s not simply a quality of any particular religion, it’s one of the most common memetic defenses in the human arsenal.
Mr. Potter, in the end people all do what they want to do. Sometimes people give names like ‘right’ to the things they want to do, but how could we possibly act on anything but our own desires?
-- Rational!Quirrell, HPMoR chapter 20
In other words: how else can you justify a moral belief and consequent actions, except by saying that you really truly believe in your heart that you’re Right?
We should not conflate the fact that almost all people other than Manson think he was morally wrong with the fact that his justification for his action seems to me to be of the same kind as the justifications anyone else ever gives for their moral beliefs and actions.
Unlike Quirrell, Penn Jillette is not referring to “knowing in your heart” that your moral values are correct, but to “knowing in your heart” some matters of fact (which may then serve as a justification for having some moral values, or directly for some action).
In what way is “deserve” a matter of fact?
“Deserving” is a moral theorem, not a moral axiom. You can most definitely test and check whether someone deserves something, by asking about the rules of the game and their position within the game.
If there is no game at hand, I would say “deserving” becomes nonsense, but that’s just me.
If you’re a moral realist, and you think moral opinions are statements of fact (which may be right or wrong), then you think it’s possible to “know in your heart” moral “facts”.
If you’re a moral anti-realist (like me), and you think moral opinions are statements of preferences (in other words, statements of fact about your own preferences and your own brain-wiring), then all moral opinions are such. And then surely Manson’s statement of his preferences has the same status as anyone else’s, and the only difference is that most people disagree with Manson.
What else is there?
However, it’s true that Jillette talks about factual amoral beliefs like fairies and gods. So my comment was somewhat misdirected. I still think it’s partly relevant, because people who believe in gods (i.e. most people) usually tie them closely to their moral opinions. It’s impossible to discuss morals (of most humans) without discussing religious beliefs.
You have justified Charlie Manson. If it’s true for you, why isn’t it true for them?
That leaves the question of how Penn actually knows that Charlie Manson was acting based on what his heart was telling him.
Psychopaths are frequently bad at empathy or “listening to their hearts”. It might even be the defining characteristic of what makes someone a psychopath.
You missed the point entirely. ‘Listening to their (own) hearts’ is not empathy, it’s just giving credibility to your instinctive beliefs, regardless of whether they have a basis or not. How is believing that everyone is connected by a network of magical energy tethers and acting according to that any different than believing that my soul will be saved if I massacre 40 people and acting on that?
The only difference is the actual acts that you take due to the beliefs. Mind you, it’s a very important difference, but the quote is not talking about that, it’s talking about beliefs themselves and using them as a sufficient justification for acts.
‘Listening to their (own) hearts’ is not empathy, it’s just giving credibility to your instinctive beliefs, regardless of whether they have a basis or not.
I think that plenty of people who call themselves rationalists simply have no idea what listening to one’s own heart actually means.
It’s like talking with a blind man, who has no concept of how green differs from red, about how one uses a traffic light to decide when to stop the car. “You mean at one time one lamp shows you that you have to stop and at another time it tells you to go ahead? How do you tell the difference?”
How is believing that everyone is connected by a network of magical energy tethers and acting according to that any different than believing that my soul will be saved if I massacre 40 people and acting on that?
You basically left out the part about listening to your heart.
Having a cognitive belief and making decisions based on mental analysis of the consequences of the belief is not what listening to one’s heart is about.
If a human tries to murder another, certain automatic programming fires that dissuades the human from killing. Emotions come up. If you listen to them, you won’t kill.
You actually have to refuse to listen to your heart to be capable of killing. Maybe there are a few Buddhists who manage to be in a complete state of pure heartfelt love while they ram a knife into someone else’s heart, but that’s very far from what 99.99% of the population is capable of.
In the military soldiers get trained to disassociate the emotions that prevent them from killing others.
Psychopaths usually do have a bunch of beliefs about morals. What they lack is the ability to listen to their hearts in a way that guides their actions.
The philosophers of ethics steal more books than other philosophers. It’s not clear that well thought out moral beliefs are useful for preventing people from engaging in immoral actions.
The only difference is the actual acts that you take due to the beliefs.
No. Whether or not someone is in their head or listens to their heart can matter to the people around them, if those people are perceptive enough to tell the difference. It probably affects most people on an unconscious level.
Listening to your heart just means listening to your innermost desires. It has nothing to do with empathy. Meaning that psychopaths listen to their heart just as much as anyone else. I’ve never heard anyone use the idiom “listen to your heart” to mean to practice empathy.
Listening to your heart just means listening to your innermost desires.
Sexual lust would be a desire that is not felt in the heart but elsewhere.
The heart is a specific place in the body. Recently a woman in my meditation group said that she got a perception of the part of her body behind her heart, and that that part gives different answers, and she now experiments with following those answers.
That’s a very high level of self-perception that most people who speak about listening to their heart don’t have. Most people are a bit more vague about what part of their body they are listening to.
There’s a reason why people lay their hand on the heart when making an oath and not on their heads or their bellies. It does something on a physiological level.
I’ve never heard anyone use the idiom “listen to your heart” to mean to practice empathy.
People rather use phrases like having a heartfelt connection or connecting with someone’s heart. To do that you usually need a connection to your own heart.
Sexual lust would be a desire that is not felt in the heart but elsewhere.
You’re taking this English idiom too literally. It reminds me of when I mentioned “killing two birds with one stone” to my Italian-born girlfriend and she was horrified. I had to explain to her that one is not literally killing two birds with one stone; your continued literalism about this particular turn of phrase would be like her continuing to insist that I’m using a metaphor in my own native language wrong since I’m not using stones nor are any birds around.
A good portion of the New Age crowd takes the idiom literally. Listening to their heart is something different than listening to their gut. Different place in the body. Different qualia.
Penn Jillette’s problem is that he takes something that’s meant literally and pretends that it means something different. It’s like talking to the blind man who thinks that the red and green you are talking about are metaphors for apples and trees.
I grant that there are people who just talk the talk and don’t walk the walk, who don’t mean it literally. People who read too many books. But it’s a strawman to assume that all people are like this.
The heart is a specific place in the body. Recently a woman in my meditation group said that she got a perception of the part of her body behind her heart, and that that part gives different answers, and she now experiments with following those answers.
That’s a very high level of self-perception that most people who speak about listening to their heart don’t have. Most people are a bit more vague about what part of their body they are listening to.
Why should they have any such perception? The literal heart doesn’t provide any answers whatsoever, the “heart” answers are generated in the brain as much as any of the other ones.
the “heart” answers are generated in the brain as much as any of the other ones.
There are plenty of neurons outside the brain, so I don’t know whether that’s true. Regardless, the motor cortex has somewhere a representation of the heart that’s “in the brain”. Given that phantom limbs can hurt, it’s probably somewhere in the motor cortex with feedback channels to the actual body location.
Why should they have any such perception?
That’s a complicated question.
I would preface it by saying that language is, evolutionarily, a recent invention. We are not evolved for that purpose. It’s a byproduct. An accident more than a planned thing. A dog doesn’t need to have a verbalized understanding of a situation to decide whether to do A or B.
It delves into the nature of what emotions are. In academia you have plenty of people who are, in a practical sense, blind when it comes to perceiving what happens in their body. People who declare blindness a virtue.
If a man gets an erection and his attention goes to that part of his body, it’s evolutionarily useful for the man to do things that lead to having sex.
If the same man has an empty stomach and his attention goes to perceiving the feeling of an empty stomach, that in turn leads to different actions.
Somewhere along those lines it made “sense” for evolution to develop a system of emotions where emotions are “located” somewhere in the body. Reuse of already existing neural patterns might also play a huge role. Evolution frequently works by reusing parts that already exist and were built for other purposes.
Years ago, in an effort to understand the brain, I bought a book called Introducing the Mind and Brain by Angus Gellatly, who’s a professor of Cognitive Psychology.
At the beginning when he recaps the history of the mind he writes:
Homer’s vocabulary does not include mental terms such as “think”, “decide”, “believe”, “doubt” or “desire”. The characters in the stories do not decide to do anything. They have no free will.
Where we would refer to thinking or pondering, Homer’s people refer to speaking to or hearing from their own organs: “I told my heart”, or “my heart told me”. Feelings and emotions are also described in this half-strange, half-familiar manner. Feelings are always located in some part of the body, often the midriff. A sharp intake of breath, the palpitating of the heart, or the uttering of cries is a feeling. A feeling is not some inner thing separate from its bodily manifestation.
At the time I first read those words, I also agreed with the strangeness of the idea. Now, years later, I’m in touch with my body well enough to completely understand why it makes sense to speak that way. I’m no longer blind. Even on a bad day I can tell apart midriff/stomach, heart and head. I also know people with better kinesthetic perception than myself.
To return the hard questions: why do you think that humans have beliefs? The concept doesn’t seem straightforward enough that it was around in Homer’s day. Do you think dogs have them? Doves? Ants? Caenorhabditis elegans?
Bonus question, when do you think that humans started “believing” in beliefs?
Re: Homer’s vocabulary not including mental terms: this is one of the things that Julian Jaynes points to as evidence of his “bicameral mind”. Do you happen to know whether the book you read has any connection to Jaynes’ work?
The book that I read is mostly an introduction to neuroscience that says a bunch of things everyone is supposed to know and illustrates them with pretty pictures. It begins, like a lot of textbooks, with talking about the history of the subject. It’s not the kind of book that tries to say something new.
Julian Jaynes isn’t referenced. But given that Jaynes is widely read, I think there’s a good chance that a Cognitive Psychology professor like Gellatly read him.
In general, reading on Wikipedia that Jaynes influenced Daniel Dennett is funny, given that Dennett says things like consciousness doesn’t exist, or is a lie that the brain tells itself.
The thing that Jaynes calls consciousness might be called ‘ego’ by a Buddhist who wants to transcend it to reach a state of higher consciousness.
At the time I first read those words, I also agreed with the strangeness of the idea. Now, years later, I’m in touch with my body well enough to completely understand why it makes sense to speak that way. I’m no longer blind. Even on a bad day I can tell apart midriff/stomach, heart and head. I also know people with better kinesthetic perception than myself.
I would say that this is probably a result of different emotions being associated with certain physiological responses. The body reacts to what’s going on in the brain, and the brain gets further feedback from that.
I recognize the responses from various parts of my body when I think, but that doesn’t mean that other parts of my body are doing the thinking for me, or that imagining they are would result in my making better decisions.
To return the hard questions: why do you think that humans have beliefs? The concept doesn’t seem straightforward enough that it was around in Homer’s day. Do you think dogs have them? Doves? Ants? Caenorhabditis elegans?
Bonus question, when do you think that humans started “believing” in beliefs?
Could you make clearer what you mean by beliefs, or what it means to “believe” in beliefs? As-is, the questions seem too vague to adequately answer.
Could you make clearer what you mean by beliefs, or what it means to “believe” in beliefs? As-is, the questions seem too vague to adequately answer.
In Homer’s time there was no concept of beliefs. In this discussion there’s the notion that people who listen to their hearts somehow develop the wrong beliefs and that’s bad.
So whatever Penn Jillette means when he says “believe”. In case you think that’s not a coherent concept, that would also be an answer that I would accept.
I recognize the responses from various parts of my body when I think, but that doesn’t mean that other parts of my body are doing the thinking for me, or that imagining they are would result in my making better decisions.
I’m not arguing better or worse. I’m arguing different. People who listen to their hearts don’t go on killing sprees. They won’t push fat men off bridges. If you think that not enough fat men are pushed off bridges, then you might argue against “listening to your heart”, but that’s a very different discussion.
If I’m having this discussion on LW, I’m mostly in my head. That’s completely appropriate. If I were mainly in my head while dancing Salsa, that would lead to a lot of bad decisions during Salsa dancing.
Beyond bad decisions, if the girl with whom I’m dancing is perceptive, it will feel inappropriate to her.
I’d like to point out that this is not an established fact. This is a theory which has been debated and which I don’t think has made it to mainstream status. It is also my impression that the Odyssey is somewhat different from the Iliad in that regard.
This is a theory which has been debated and which I don’t think has made it to mainstream status.
The book from which I took it is a mainstream introduction to cognitive science, written by a professor of cognitive psychology who has published papers. I read it because someone in my bioinformatics university course recommended it to me as an introduction. What do you mean by “mainstream status” if that doesn’t count as mainstream?
By mainstream status I mean “generally accepted in the field as true”. Lots of professors publish lots of books with claims that are not generally accepted as true. Sometimes this “not” is “not yet”, sometimes it is “not and never will be because they are wrong”, and sometimes it is “maybe, but the probability looks low and there are better approaches”.
First I haven’t investigated the issue beyond this one book. If you know of a good source arguing the opposite, I’m happy to look up your reference.
Secondly, I don’t think it’s useful to equate mainstream belief with consensus belief. I think it’s quite useful to have a term for ideas found in mainstream science textbooks, compared to ideas that you don’t find in mainstream science textbooks.
Science by its nature isn’t certain, and science textbooks can contain claims that aren’t true. If I’m discussing a topic like this, I think it’s useful to be clear about which of my ideas come from a mainstream science source and which come from other sources such as personal experience or an NLP seminar.
For the purposes of the point that I made it’s also not important whether Homer in particular had a concept of beliefs or whether I find some African tribe who doesn’t have a word for it.
The point is to go back and question core assumptions and to get clearer about the mental concepts that one uses, because one doesn’t take them for granted.
Don’t model human cognition in form of beliefs just because your parents told you that humans make decisions according to beliefs. I think that’s a core part of the rationalist project.
At a LW meetup I ran a session about emotions and asked at the start what everyone thought the word meant. Roughly a third said A, a third said B and the last third had no opinion.
If you are not clear about what you mean when you say “believe” and you make complex arguments that build on the term, you are going to make mistakes and not see them, because your terms are muddy and you are making a bunch of assumptions about which you never thought explicitly.
So whatever Penn Jillette means when he says “believe”. In case you think that’s not a coherent concept, that would also be an answer that I would accept.
If we’re talking about Penn Jillette’s conception of “beliefs”, then I would say that he probably has in mind pieces of information that our minds can represent and reason about abstractly, although this is of course somewhat speculative as I cannot speak for Penn Jillette. I would say that this probably doesn’t apply to the other species you named, but may apply to some other existing species, and probably some of our ancestors in the Homo genus.
I’m not arguing better or worse. I’m arguing different. People who listen to their hearts don’t go on killing sprees.
I would regard this as a highly extraordinary claim demanding commensurately extraordinary evidence, and I would caution that this is a case which seems very prone to inviting the No True Scotsman fallacy. First off, how would you determine whether an individual listens to their heart or not, and second, how do you know that individuals who listen to their hearts don’t engage in such antisocial behaviors?
I would regard this as a highly extraordinary claim demanding commensurately extraordinary evidence, and I would caution that this is a case which seems very prone to inviting the No True Scotsman fallacy. First off, how would you determine whether an individual listens to their heart or not, and second, how do you know that individuals who listen to their hearts don’t engage in such antisocial behaviors?
There are people who listen to their heads who go on killing sprees. I believe Christian’s claim is that listening to one’s heart is either uncorrelated or negatively correlated with going on killing sprees.
I believe Christian’s claim is that listening to one’s heart is either uncorrelated or negatively correlated with going on killing sprees.
I don’t believe this is the case; I think the continuation of this discussion in other comments has made it pretty clear that he’s arguing that, while listening to their hearts, people do not go on killing sprees at all.
First off, how would you determine whether an individual listens to their heart or not,
At the moment, by observing and checking whether specific qualia are there. If I really wanted to prove it with numbers, that would require that I systematically calibrate my own perception first and determine the sensitivity and specificity of my perception of other people.
I’m also still a person who’s fairly intellectual. There are people with better perception than myself and getting them to do the assessing might be better.
Having a way to get that data via a more automated process that doesn’t need a perceptive human would also be nice. At the moment I however have no clear idea about how to go about measuring or the necessary financial resources to finance that kind of research.
how do you know that individuals who listen to their hearts don’t engage in such antisocial behaviors?
A mix of more theoretical thinking and practical observation of how the behavior of people with whom I’m interacting changes when the qualia I’m perceiving suggest that the locus of their attention within their body has changed.
I would regard this as a highly extraordinary claim demanding commensurately extraordinary evidence,
I understand that’s an advanced claim. At the moment I’m more concerned with making clear what the claim is than proving it.
If I say that Harry is not going to kill people if he listens to Hufflepuff but might kill if he listens to Slytherin, would that be a strange claim for you? If I say people who always listen to Hufflepuff don’t go on killing sprees would that seem strange to you? Most people you know don’t have the ability to mentally commit to listening to Hufflepuff 100% in every decision that they make in their lives.
If I remember right, Eliezer uses those different personas because it’s popular in systematic therapy to do so and someone he knows taught him that thinking that way can be useful. Those personas have a different quality than organs that can be perceived kinesthetically, but they are not that different.
Lastly, it’s useful to keep in mind what “extraordinary claims need extraordinary evidence” can lead to. If you take it too far it shuts people down from saying what they honestly believe and instead lets them argue beliefs that they don’t fully stand behind.
We all have many beliefs that come out of personal experience and not from reading papers. There are areas where personal experiences differ massively. In those cases we don’t get certainty about what’s true when someone else tells us how he thinks the world works. Simply understanding the models of other people is still useful, because then you might use such a model sometime in the future, when it explains something you see better than your other mental models do.
If I say that Harry is not going to kill people if he listens to Hufflepuff but might kill if he listens to Slytherin, would that be a strange claim for you? If I say people who always listen to Hufflepuff don’t go on killing sprees would that seem strange to you?
No and yes respectively.
Hufflepuff isn’t a natural category, Harry!Hufflepuff is an abstraction based on Harry filtering his personality through certain criteria and impulses, such as what he conceives of as loyalty and compassion. Do I think that Harry, reasoning through his conception of loyalty and compassion, would go on a killing spree? Unlikely. Do I think that there are people who, reasoning through their conceptions of loyalty and compassion, would go on killing sprees? Absolutely.
A neurological fact that may be of some relevance here. Oxytocin, the chemical associated with triggering feelings of love and affection, has also been found to trigger increases in xenophobia and ingroup/outgroup bias.
Feelings of love and loyalty are not anathema to hate and violence. Rather, they often go hand in hand; the same feelings that unite you with a group can also be those which make you feel you’re united against something else.
Lastly, it’s useful to keep in mind what “extraordinary claims need extraordinary evidence” can lead to. If you take it too far it shuts people down from saying what they honestly believe and instead lets them argue beliefs that they don’t fully stand behind.
How so? I don’t take any issue with your stating your beliefs and arguing in their favor. As is, I think that they’re misguided, but that’s because I think the weight of evidence is not in their favor. If you convinced me it were, I would change my mind. I think it would be far more useful for you to defend your belief with the best evidence you think favors it than to simply assert your belief.
Hufflepuff isn’t a natural category, Harry!Hufflepuff is an abstraction based on Harry filtering his personality through certain criteria and impulses
Depends on what you mean by natural. Personas like that are probably as natural as beliefs are.
Both aren’t hard-coded but develop over time. I would guess that for most people on LW, personas like that aren’t in their conscious awareness. That doesn’t mean they don’t influence decision making.
Especially the personas that represent the parents often have a strong effect on people’s decisions in life.
Feelings of love and loyalty are not anathema to hate and violence.
Hate is usually felt in the midriff/gut/belly/stomach area and not where the heart is.
People also don’t just go at random on killing sprees. It’s the result of a longer process. Men often fail at approaching a hot woman because they have emotions that block them from doing so. It’s necessary to process emotions first, before being able to approach a hot woman.
Simply being attracted to the woman isn’t enough to overrule those other process that prevent that behavior.
If it comes to behavior like killing another person, I would assume that the emotional barriers are even stronger.
Only 15 to 20 percent of the American riflemen in combat during World War II would fire at the enemy.
That suggests that you need a lot more than a bit of oxytocin and increased ingroup/outgroup bias. I don’t think that the US army failed at teaching its soldiers the belief that shooting at the enemy makes sense.
The modern solution to how you get soldiers to fire at the enemy is desensitization training.
I think I observed in the last year two persons who would qualify clinically as psychopathic. Both appeared to me very absent from their own bodies (a description of a qualia that I have). Orders of magnitude more than the other people with whom I interacted in the year.
Let’s say someone gets dumped by his girlfriend. His heart hurts very much. Enough that he’d rather not listen to it, to reduce the pain. He blocks out the feeling by disassociating it. The person also feels very angry in his midriff and wants to act out that anger. That person might kill his girlfriend in revenge. There are probably a bunch of other filters he has to overcome.
I think it would be far more useful for you to defend your belief with the best evidence you think favors it than to simply assert your belief.
I’m thinking you underrate the difficulty of communicating what the belief actually is, without expressing it in a way where you will think that I believe something that’s different from what I actually believe. The Jaynes example shows how a word like consciousness might be interpreted opposite to how it’s meant.
I’m essentially trying to explain new phenomenological primitives. Telling someone who’s not well educated in physics that a steel ball thrown at the ground bounces back because of springiness, in a way that you will be understood, is not easy. The idea that the steel ball changes like a spring is not easy to accept. Even for students who believe that their physics teacher tells them the truth, it takes time to accept that idea.
Some of the literature is very pessimistic about the idea of teaching new phenomenological primitives in physics classes instead of reorganising existing ones even if you are a teacher with authority over student and have plenty of time.
Attempting to do the same thing in an online discussion is ambitious.
--Penn Jillette in “Penn Jillette Is Willing to Be a Guest on Adolf Hitler’s Talk Show, Vanity Fair, June 17, 2010
This quote seems like it’s lumping every process for arriving at beliefs besides reason into one. “If you don’t follow the process I understand and is guaranteed not to produce beliefs like that, then I can’t guarantee you won’t produce beliefs like that!” But there are many such processes besides reason, that could be going on in their “hearts” to produce their beliefs. Because they are all opaque and non-negotiable and not this particular one you trust not to make people murder Sharon Tate, does not mean that they all have the same probability of producing plane-flying-into-building beliefs.
Consider the following made-up quote: “when you say you believe something is acceptable for some reason other than the Bible said so, you have completely justified Stalin’s planned famines. You have justified Pol Pot. If it’s acceptable for for you, why isn’t it acceptable for them? Why are you different? If you say ‘I believe that gays should not be stoned to death and the Bible doesn’t support me but I believe it in my heart’, then it’s perfectly okay to believe in your heart that dissidents should be sent to be worked to death in Siberia. It’s perfectly okay to believe because your secular morality says so that all the intellectuals in your country need to be killed.”
I would respond to it: “Stop lumping all moralities into two classes, your morality, and all others. One of these lumps has lots of variation in it, and sub-lumps which need to be distinguished, because most of them do not actually condone gulags”
And likewise I respond to Penn Jilette’s quote: “Stop lumping all epistemologies into two classes, yours, and the one where people draw beliefs from their ‘hearts’. One of these lumps has lots of variation in it, and sub-lumps which need to be distinguished, because most of them do not actually result in beliefs that drive them to fly planes into buildings.”
The wishful-thinking new-age “all powerful force of love” faith epistemology is actually pretty safe in terms of not driving people to violence who wouldn’t already be inclined to it. That belief wouldn’t make them feel good. Though of course, faith plus ancient texts which condone violence can be more dangerous, though as we know empirically, for some reason, people driven to violence by their religions are rare these days, even coming from religions like that.
I don’t think it’s lumping everything together. It’s criticizing the rule “Act on what you feel in your heart.” That applies to a lot of people’s beliefs, but it certainly isn’t the epistemology of everyone who doesn’t agree with Penn Jillette.
The problem with “Act on what you feel in your heart” is that it’s too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible. But if my epistemology is an appeal to an external source (which I guess in this context would be a religious book but I’m going to use “believe whatever Rameses II believed” because I think that’s funnier), then that doesn’t necessarily have the same problem.
You can criticize my choice of Rameses II, and you probably should. But now my epistemology is based on an external source and not just my feelings. Unless you reduce me to saying I trust Rameses because I Just Feel that he’s trustworthy, this epistemology does not have the same problem as the one criticized in the quote.
All this to say, Jillette is not unfairly lumping things together and there exist types of morality/epistemology that can be wrong without having this argument apply.
‘Act on an external standard’ is just as generalizable—because you can choose just about anything as your standard. You might choose to consistently act like Gandhi, or like Hitler, or like Zeus, or like a certain book suggests, or like my cat Peter who enjoys killing things and scratching cardboard boxes. If the only thing I know about you is that you consistently behave like someone else, but I don’t know like whom, then I can’t actually predict your behavior at all.
The more important question is: if you act on what you feel in your heart, what determines or changes what is in your heart? And if you act on an external standard, what makes you choose or change your standard?
It looks like there’s all this undefined behavior, and demons coming out the nose from the outside because you aren’t looking at the exact details of what’s going on in with their feelings that are choosing the beliefs. Though a C compiler given an undefined construct may cause your program to crash, it will never literally cause demons to come out of your nose, and you could figure this out if you looked at the implementation of the compiler. It’s still deterministic.
As an atheistic meta-ethical ant-realist, my utility function is basically whatever I want it to be. It’s entirely internal. From the outside, from someone who has a system where they follow something external and clearly specified, they could shout “Nasal demons!”, but demons will never come out my nose, and my internal, ever so frighteningly non-negotiable desires are never going to include planned famines. It has reliable internal structure.
The mistake is looking at a particular kind of specification that defines all the behavior, and then looking at a system not covered by that specification, but which is controlled by another specification you haven’t bothered to understand, and saying “Who can possibly say what that system will do?”
Some processors (even x86) have instructions (such as bit rotate) which are useful for significant performance boosts in stuff like cryptography, and yet aren’t accessible from C or C++, and to use it you have to perform hacks like writing the machine code out as bytes, casting its address to a function pointer and calling it. That’s undefined behavior with respect to the C/C++ standard. But it’s perfectly predictable if you know what platform you’re on.
Other people who aren’t meta-ethical anti-realists’ utility functions are not really negotiable either. You can’t really give them a valid argument that will convince them not to do something evil if they happen to be psychopaths. They just have internal desires and things they care about, and they care a lot more about having a morality which sounds logical when argued for than I do.
And if you actually examine what’s going on with the feelings of people with feeling-driven epistemology that makes them believe things, instead of just shouting “Nasal demons! Unspecified behavior! Infinitely beyond the reach of understanding!” you will see that the non-psychopathic ones have mostly-deterministic internal structure to their feelings that prevents them from believing that they should murder Sharon Tate. And psychopaths won’t be made ethical by reasoning with them anyway. I don’t believe the 9/11 hijackers were psychopaths, but that’s the holy book problem I mentioned, and a rare case.
In most cases of undefined C constructs, there isn’t another carefully-tuned structure that’s doing the job of the C standard in making the behavior something you want, so you crash. And faith-epistemology does behave like this (crashing, rather than running hacky cryptographic code that uses the rotate instructions) when it comes to generating beliefs that don’t have obvious consequences to the user. So it would have been a fair criticism to say “You believe something because you believe it in your heart, and you’ve justified not signing your children up for cryonics because you believe in an afterlife,” because (A) they actually do that, (B) it’s a result of them having an epistemology which doesn’t track the truth.
Disclaimer: I’m not signed up for cryonics, though if I had kids, they would be.
I very much doubt that. At least with present technology you cannot self-modify to prefer dead babies over live ones; and there’s presumably no technological advance that can make you want to.
If utility functions are those constructed by the VNM theorem, your utility function is your wants; it is not something you can have wants about. There is nothing in the machinery of the theorem that allows for a utility function to talk about itself, to have wants about wants. Utility functions and the lotteries that they evaluate belong to different worlds.
Are there theorems about the existence and construction of self-inspecting utility functions?
That means you can actually make people less harmful if you tell them to listen to their hearts instead of listening to ancient texts. The person who’s completely in their head and analyses the ancient text for absolute guidance of action is dangerous.
A lot of religions also have tricks were the believer has to go through painful exercises. Just look at a Christian sect like Opus Dei with cilices. The kind of religious believer who wears a cilice loses touch with his heart. Getting someone who’s in the habit of causing his own body pain with a cilice to harm other people is easier.
I’d have to disagree here; I think that “faith” is a useful reference class that pretty effectively cleaves reality at the joints, which does in fact lump together the epistemologies Penn Jilette is objecting to.
The fact that some communities of people who have norms which promote taking beliefs on faith do not tend to engage in acts of violence, while some such communities do, does not mean that their epistemologies are particularly distinct. Their specific beliefs might be different, but one group will not have much basis to criticize the grounds of others’ beliefs.
The flaw he’s arguing here is not “faith-based reasoning sometimes drives people to commit acts of violence,” but “faith-based reasoning is unreliable enough that it can justify anything, in practice as well as principle, including acts of extreme violence.”
People who follow the moral code of the Bible versus peopel that don’t is also a pretty clear criteria that separates some epistemologies from others.
People who uses a pendulum to make decisions as a very different epistemology than someone who thinks about what the authorities in his particular church want him to do and acts accordingly.
The kind of people who win the world debating championship also haave no problem justying policies like genocide with rational arguments that win competive intellectual debates.
Justifying actions is something different than decision criteria.
Yes, but then you can go a step down from there, and ask “why do you believe in the contents of the bible?” For some individuals, this will actually be a question of evidence; they are prepared to reason about the evidence for and against the truth of the biblical narrative, and reject it given an adequate balance of evidence. They’re generally more biased on the question than they realize, but they are at least convinced that they must have adequate evidence to justify their belief in the biblical narrative.
I have argued people out of their religious belief before (and not just Christianity,) but never someone who thought that it was correct to take factual beliefs that feel right “on faith” without first convincing them that this is incorrect as a general rule, not simply in the specific case of religion. This is an epistemic underpinning which unites people from different religions, whatever tenets or holy books they might ascribe to. I’ve also argued the same point with people who were not religious; it’s not simply a quality of any particular religion, it’s one of the most common memetic defenses in the human arsenal.
-- Rational!Quirrel, HPMoR chapter 20
In other words: how else can you justify a moral belief and consequent actions, except by saying that you really truly believe in your heart that you’re Right?
We should not confuse between the fact that almost all people other than Manson think he was morally wrong, and the fact that his justification for his action seems to me to be of the same kind as the justifications anyone else ever gives for their moral beliefs and actions.
Unlike Quirrell, Penn Jillette is not referring to “knowing in your heart” that your moral values are correct, but to “knowing in your heart” some matters of fact (which may then serve as a justification for having some moral values, or directly for some action).
In what way is “deserve” a matter of fact?
“Deserving” is a moral theorem, not a moral axiom. You can most definitely test and check whether someone deserves something, by asking about the rules of the game and their position within the game.
If there is no game at hand, I would say “deserving” becomes nonsense, but that’s just me.
If you’re a moral realist, and you think moral opinions are statements of fact (which may be right or wrong), then you think it’s possible to “know in your heart” moral “facts”.
If you’re a moral anti-realist (like me), and you think moral opinions are statements of preferences (in other words, statements of fact about your own preferences and your own brain-wiring), then all moral opinions are such. And then surely Manson’s statement of his preferences has the same status as anyone else’s, and the only difference is that most people disagree with Manson.
What else is there?
However, it’s true that Jillette talks about factual amoral beliefs like fairies and gods. So my comment was somewhat misdirected. I still think it’s partly relevant, because people who believe in gods (i.e. most people) usually tie them closely to their moral opinions. It’s impossible to discuss morals (of most humans) without discussing religious beliefs.
That leaves the question of how Penn actually knows that Chalie Manson was acting based on what his heart was telling him.
Psychopaths are frequently bad at empathy or “listening to their hearts”. It might even be the defining characteristic of what makes someone a psychopath.
You missed the point entirely. ‘Listening to their (own) hearts’ is not empathy, it’s just giving credibility to your instinctive beliefs, regardless of wether they have a basis or not. How is believing that everyone is connected by a network of magical energy tethers and acting according to that any different than believing that my soul will be saved if I massacre 40 people and acting on that?
The only difference is the actual acts that you take due to the beliefs. Mind you, it’s a very important difference, but the quote is not talking about that, it’s talking about beliefs themselves and using them as a sufficient justification for acts.
I think that plenty of people who call themselves rationalists simply have no idea what listening to one’s own heart actually means.
It’s like talking with a blind man, who has no concept of how green differs from red, about how one uses a traffic light to decide when to stop the car. “You mean at one time the lamp shows you that you have to stop, and at another time it tells you to go ahead? How do you tell the difference?”
You basically left out the part about listening to your heart. Having a cognitive belief and making decisions based on mental analysis of the consequences of the belief is not what listening to one’s heart is about.
If a human tries to murder another, certain automatic programming fires that dissuades the human from killing. Emotions come up. If you listen to them, you won’t kill. You actually have to refuse to listen to your heart to be capable of killing. Maybe there are a few Buddhists who manage to be in a complete state of pure heartfelt love while they ram a knife into someone else’s heart, but that’s very far from what 99.99% of the population is capable of.
In the military, soldiers get trained to dissociate from the emotions that prevent them from killing others. Psychopaths usually do have a bunch of beliefs about morals. What they lack is the ability to listen to their hearts in a way that guides their actions.
The philosophers of ethics steal more books than other philosophers. It’s not clear that well thought out moral beliefs are useful for preventing people from engaging in immoral actions.
No. Whether someone is in their head or listens to their heart can matter to the people around them, if those people are perceptive enough to tell the difference. It probably affects most people on an unconscious level.
Listening to your heart just means listening to your innermost desires. It has nothing to do with empathy. Meaning that psychopaths listen to their heart just as much as anyone else. I’ve never heard anyone use the idiom “listen to your heart” to mean to practice empathy.
Sexual lust would be a desire that is felt not in the heart but elsewhere.
The heart is a specific place in the body. Recently a woman in my meditation group said that she had developed a perception of the part of her body behind her heart, that that part gives different answers, and that she now experiments with following those answers.
That’s a very high level of self-perception that most people who speak about listening to their heart don’t have. Most people are a bit more vague about which part of their body they are listening to.
There’s a reason why people lay their hand on the heart when making an oath, rather than on their head or their belly. It does something on a physiological level.
People are more likely to use phrases like having a heartfelt connection, or connecting with someone’s heart. To do that you usually need a connection to your own heart.
You’re taking this English idiom too literally. It reminds me of when I mentioned “killing two birds with one stone” to my Italian born girlfriend and she was horrified. I had to explain to her that one is not literally killing two birds with one stone; your continued literalism of this particular turn of phrase would be like her continuing to insist that I’m using a metaphor in my own native language wrong since I’m not using stones nor are any birds around.
A good portion of the New Age crowd takes the idiom literally. Listening to their heart is something different than listening to their gut. Different place in the body. Different qualia.
Penn Jillette’s problem is that he takes something that’s meant literally and pretends that it means something different. It’s like talking to the blind man who thinks that the red and green you are talking about are metaphors for apples and trees.
I grant that there are people who just talk the talk and don’t walk the walk and don’t mean it literally; people who read too many books. But it’s a strawman to assume that all people are like this.
Why should they have any such perception? The literal heart doesn’t provide any answers whatsoever, the “heart” answers are generated in the brain as much as any of the other ones.
There are plenty of neurons outside the brain, so I don’t know whether that’s true. Regardless, the motor cortex has somewhere a representation of the heart that’s “in the brain”. Given that phantom limbs can hurt, it’s probably somewhere in the motor cortex, with feedback channels to the actual body location.
That’s a complicated question.
I would preface it by saying that language is, evolutionarily, a recent invention. We are not evolved for that purpose. It’s a byproduct; an accident more than a planned thing. A dog doesn’t need a verbalized understanding of a situation to decide whether to do A or B.
It delves into the nature of what emotions are. In academia you have plenty of people who are, in a practical sense, blind when it comes to perceiving what happens in their bodies; people who have declared that blindness a virtue.
If a man gets an erection and his attention goes to that part of his body, it’s evolutionarily useful for him to do things that lead to having sex.
If the same man has an empty stomach and his attention goes to perceiving the feeling of the empty stomach, that in turn leads to different actions.
Somewhere along those lines it made “sense” for evolution to develop a system of emotions where emotions are “located” somewhere in the body. Reuse of already existing neural patterns might also play a huge role. Evolution frequently works by reusing parts that already exist and were built for other purposes.
Years ago, in an effort to understand the brain, I bought a book called Introducing the Mind and Brain by Angus Gellatly, who is a professor of Cognitive Psychology.
At the beginning when he recaps the history of the mind he writes:
At the time I first read those words, I also agreed with the strangeness of the idea. Now, years later, I’m in touch with my body well enough to completely understand why it makes sense to speak that way. I’m no longer blind. Even on a bad day I can tell apart midriff/stomach, heart, and head. I also know people with better kinesthetic perception than myself.
To return some hard questions: why do you think that humans have beliefs? The concept doesn’t seem straightforward enough to have been around in Homer’s day. Do you think dogs have them? Doves? Ants? Caenorhabditis elegans?
Bonus question, when do you think that humans started “believing” in beliefs?
Re: Homer’s vocabulary not including mental terms: this is one of the things that Julian Jaynes points to as evidence of his “bicameral mind”. Do you happen to know whether the book you read has any connection to Jaynes’ work?
The book that I read is mostly an introduction to neuroscience that says a bunch of things everyone is supposed to know and illustrates them with pretty pictures. It begins, like a lot of textbooks, by talking about the history of the subject. It’s not the kind of book that tries to say something new.
Julian Jaynes isn’t referenced. But given that Jaynes is widely read, I think there’s a good chance that a Cognitive Psychology professor like Gellatly has read him.
In general, reading on Wikipedia that Jaynes influenced Daniel Dennett is funny, when Dennett says things like consciousness doesn’t exist, or is a lie that the brain tells itself. The thing that Jaynes calls consciousness might be called ‘ego’ by a Buddhist who wants to transcend it to reach a state of higher consciousness.
I would say that this is probably a result of different emotions being associated with certain physiological responses. The body reacts to what’s going on in the brain, and the brain gets further feedback from that.
I recognize the responses from various parts of my body when I think, but that doesn’t mean that other parts of my body are doing the thinking for me, or that imagining they are would result in my making better decisions.
Could you make clearer what you mean by beliefs, or what it means to “believe” in beliefs? As-is, the questions seem too vague to adequately answer.
In Homer’s time there was no concept of beliefs. In this discussion there’s the notion that people who listen to their hearts somehow develop the wrong beliefs, and that that’s bad.
So: whatever Penn Jillette means when he says “believe”. In case you think that’s not a coherent concept, that would also be an answer I would accept.
I’m not arguing better or worse. I’m arguing different. People who listen to their hearts don’t go on killing sprees. They won’t push fat men off bridges. If you think that not enough fat men are pushed off bridges, then you might argue against “listening to your heart”, but that’s a very different discussion.
If I’m having this discussion on LW, I’m mostly in my head. That’s completely appropriate. If I were mainly in my head while dancing salsa, that would lead to a lot of bad decisions during the dance. Beyond bad decisions, if the girl with whom I’m dancing is perceptive, it will feel inappropriate to her.
I’d like to point out that this is not an established fact. It is a theory which has been debated, and I don’t think it has reached mainstream status. It is also my impression that the Odyssey is somewhat different from the Iliad in that regard.
The book from which I took it is a mainstream introduction to cognitive science, written by a professor of cognitive psychology who has published papers. I read it because someone in my bioinformatics course at university recommended it to me as an introduction. What do you mean by “mainstream status”, if that doesn’t count as mainstream?
By mainstream status I mean “generally accepted in the field as true”. Lots of professors publish lots of books with claims that are not generally accepted as true. Sometimes this “not” is “not yet”, sometimes it is “not, and never will be, because they are wrong”, and sometimes it is “maybe, but the probability looks low and there are better approaches”.
First I haven’t investigated the issue beyond this one book. If you know of a good source arguing the opposite, I’m happy to look up your reference.
Secondly, I don’t think it’s useful to equate mainstream belief with consensus belief. I think it’s quite useful to have a term for ideas found in mainstream science textbooks, as opposed to ideas that you don’t find in mainstream science textbooks.
Science by its nature isn’t certain, and science textbooks can contain claims that aren’t true. If I’m discussing a topic like this, I think it’s useful to be clear about which of my ideas come from a mainstream science source and which come from other sources, such as personal experience or an NLP seminar.
For the purposes of the point that I made, it’s also not important whether Homer in particular had a concept of beliefs, or whether I find some African tribe that doesn’t have a word for it. The point is to go back, question core assumptions, and get clearer about the mental concepts one uses, rather than taking them for granted.
Don’t model human cognition in terms of beliefs just because your parents told you that humans make decisions according to beliefs. I think that’s a core part of the rationalist project.
At a LW meetup I ran a session on emotions and asked at the start what everyone thought the word meant. Roughly a third said A, a third said B, and the last third had no opinion.
If you are not clear about what you mean when you say “believe”, and you make complex arguments that build on the term, you are going to make mistakes and not see them, because your terms are muddy and you are making a bunch of assumptions you never thought about explicitly.
Yes, and that professor is a professor of cognitive psychology, not history.
If we’re talking about Penn Jillette’s conception of “beliefs”, then I would say that he probably has in mind pieces of information that our minds can represent and reason about abstractly, although this is of course somewhat speculative, as I cannot speak for Penn Jillette. I would say that this probably doesn’t apply to the other species you named, but may apply to some other existing species, and probably to some of our ancestors in the Homo genus.
I would regard this as a highly extraordinary claim demanding commensurately extraordinary evidence, and I would caution that this is a case which seems very prone to inviting the No True Scotsman fallacy. First off, how would you determine whether an individual listens to their heart or not, and second, how do you know that individuals who listen to their hearts don’t engage in such antisocial behaviors?
There are people who listen to their heads who go on killing sprees. I believe Christian’s claim is that listening to one’s heart is either uncorrelated or negatively correlated with going on killing sprees.
I don’t believe this is the case; I think the continuation of this discussion in other comments has made it pretty clear that he’s arguing that, while listening to their hearts, people do not go on killing sprees at all.
At the moment, by observing and checking whether specific qualia are there. If I really wanted to make the case in numbers, that would require systematically calibrating my own perception first and determining the sensitivity and specificity of my perception of other people (a sketch of what that calculation would look like is below).
I’m also still a person who’s fairly intellectual. There are people with better perception than myself and getting them to do the assessing might be better.
Having a way to get that data via a more automated process that doesn’t need a perceptive human would also be nice. At the moment, however, I have no clear idea of how to go about the measurement, nor the financial resources to fund that kind of research.
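To make the two terms concrete, here is a minimal sketch, with entirely made-up counts and a hypothetical helper function, of how sensitivity and specificity would be computed once a calibration study paired each “heart-perception” judgment with some independent criterion:

```python
# Illustrative only: the counts and the function are hypothetical,
# not the result of any study that was actually run.

def sensitivity_and_specificity(tp, fn, tn, fp):
    """Sensitivity = share of actual positives correctly detected.
    Specificity = share of actual negatives correctly rejected."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Example: 40 judgments compared against an independent criterion.
sens, spec = sensitivity_and_specificity(tp=12, fn=3, tn=20, fp=5)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
# -> sensitivity = 0.80, specificity = 0.80
```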
A mix of more theoretical thinking and practical observation of how the behavior of the people with whom I’m interacting changes when the qualia I’m perceiving suggest that the locus of their attention within their body has changed.
I understand that’s an advanced claim. At the moment I’m more concerned with making clear what the claim is than proving it.
If I say that Harry is not going to kill people if he listens to Hufflepuff, but might kill if he listens to Slytherin, would that be a strange claim to you? If I say that people who always listen to Hufflepuff don’t go on killing sprees, would that seem strange to you? Most people you know don’t have the ability to mentally commit to listening to Hufflepuff 100% in every decision they make in their lives.
If I remember right, Eliezer uses those different personas because it’s popular in systemic therapy to do so, and someone he knows taught him that thinking that way can be useful. Those personas have a different quality than organs that can be perceived kinesthetically, but they are not that different.
Lastly, it’s useful to keep in mind where “extraordinary claims need extraordinary evidence” can lead if you take it too far: it shuts people down from saying what they honestly believe and instead leads them to argue for beliefs they don’t fully stand behind.
We all have many beliefs that come out of personal experience rather than from reading papers. There are areas where personal experiences differ massively. In those cases we don’t get certainty about what’s true when someone else tells us how he thinks the world works. Simply understanding other people’s models is still useful, because you might use such a model sometime in the future, when it explains something you see better than your other mental models do.
No and yes respectively.
Hufflepuff isn’t a natural category, Harry!Hufflepuff is an abstraction based on Harry filtering his personality through certain criteria and impulses, such as what he conceives of as loyalty and compassion. Do I think that Harry, reasoning through his conception of loyalty and compassion, would go on a killing spree? Unlikely. Do I think that there are people who, reasoning through their conceptions of loyalty and compassion, would go on killing sprees? Absolutely.
A neurological fact that may be of some relevance here. Oxytocin, the chemical associated with triggering feelings of love and affection, has also been found to trigger increases in xenophobia and ingroup/outgroup bias.
Feelings of love and loyalty are not antithetical to hate and violence. Rather, they often go hand in hand; the same feelings that unite you with a group can also be those which make you feel you’re united against something else.
How so? I don’t take any issue with your stating your beliefs and arguing in their favor. As is, I think that they’re misguided, but that’s because I think the weight of evidence is not in their favor. If you convinced me it were, I would change my mind. I think it would be far more useful for you to defend your belief with the best evidence you think favors it than to simply assert your belief.
Depends on what you mean by natural. Personas like that are probably as natural as beliefs are. Neither is hard-coded; both develop over time. I would guess that for most people on LW, personas like that aren’t in their conscious awareness. That doesn’t mean they don’t influence decision making.
The personas that represent the parents, in particular, often have a strong effect on people’s decisions in life.
Hate is usually felt in the midriff/gut/belly/stomach area and not where the heart is.
People also don’t just go at random on killing sprees. It’s the result of a longer process. Men often fail at approaching a hot woman because they have emotions that block them from doing so. It’s necessary to process emotions first, before being able to approach a hot woman.
Simply being attracted to the woman isn’t enough to overrule those other processes that prevent that behavior. When it comes to behavior like killing another person, I would assume that the emotional barriers are even stronger.
To take an example from WWII:
That suggests you need a lot more than a bit of oxytocin-driven ingroup/outgroup bias. I don’t think the US army failed at teaching its soldiers the belief that shooting at the enemy makes sense.
The modern solution for getting soldiers to fire at the enemy is desensitization training.
In the last year I think I observed two people who would qualify clinically as psychopathic. Both appeared to me very absent from their own bodies (a description of a quale that I have), magnitudes more so than the other people with whom I interacted that year.
Let’s say someone gets dumped by his girlfriend. His heart hurts very much; enough that he’d rather not listen to it, to reduce the pain. He blocks out the feeling by dissociating from it. The person also feels very angry in his midriff and wants to act out that anger. That person might kill his girlfriend in revenge, though there are probably a bunch of other filters he has to overcome first.
I think you underrate the difficulty of communicating what the belief actually is, without expressing it in a way that leads you to think I believe something different from what I actually believe. The Jaynes example shows how a word like consciousness can be interpreted in the opposite way from how it’s meant.
I’m essentially trying to explain new phenomenological primitives. Telling someone who’s not well educated in physics that a steel ball thrown at the ground bounces back because of springiness, in a way that will be understood, is not easy. The idea that the steel ball deforms like a spring is not easy to accept. Even for students who believe that their physics teacher tells them the truth, it takes time to accept that idea.
Some of the literature is very pessimistic about the possibility of teaching new phenomenological primitives in physics classes, as opposed to reorganizing existing ones, even when you are a teacher with authority over the students and plenty of time.
Attempting to do the same thing in an online discussion is ambitious.