Could you explain your last paragraph a little more?
HungryTurtle
Fair enough, could you tell me what exactly it means to be a good rationalist?
Ok, so these skill sets contribute significantly to the productivity and health of a person. Then would you disagree with the following:
Social and emotional skills significantly contribute to health and productivity.
Any job, skill, hobby, or task that is human-driven can benefit from an increase in the acting agent's health and productivity.
Therefore, social and emotional skills are relevant (to some degree) to all other human-driven skill sets.
The feeling that I am jumping on Nebu and the idea that I am advocating a Straw Vulcan come from you using loaded words to make an extreme judgment about my meaning and my motives. First of all, I am not trying to say a rational person has to be emotionless. The fact that emotions are important doesn't mean that anyone invoking some emotional response is unconditionally right. Supporting something "not because you agree with it" but because you felt some personal attachment is the most common of psychological reflexes. I am not telling Nebu that he has to be emotionless, or that rationality segregates itself from emotions, but that the way he is using his emotions here is irrational. If you support something, how is it something you do not agree with, and why are you supporting something you do not agree with?
when the sentence would work as well with “thought”, is rude as well.
Changing "felt" for "thought" is sneaking in connotations.
Additionally, saying that the East should look to the West for enlightenment doesn’t mean there is no enlightenment to be found in the East. It just says that by far the more important enlightenment is more common in the West than the East.
Actually, saying that the East should look to the West for enlightenment says nothing about where enlightenment is more or less common, or anything about a degree of enlightenment. That is the assumption you are bringing to the statement. All this statement implies is that there are things the East could learn from the West, with no implication about how many things there are or, as was pointed out above, how many things there are in the East for the West to learn.
I voted the comment up, but not because I “agreed” with it[1], nor because I wanted to “shut up hippies”, but merely because I found it interesting and felt it earned my endorsement as a comment worth reading.
What is the point of attempting to adhere to or advocate for rationality as a human standard if the basis of your decision to support something is not that it has any real merit, but that you "felt" it was good?
Ok, then the next question: would you agree that, for a human, skills related to emotional and social connection maximize the productivity and health of a person?
Isn’t saying that Yvain’s final statement is
exactly backwards
also failing to make a distinction between a vaguely hostile comment and an extreme claim? To say it is exactly backwards is to imply that there is nothing wrong with Steve Jobs's statement. I agree with you that some of Yvain's fallacies are distorted—most notably the assumption that those who liked the comment were venting a subconscious lash at "hippies"—but that does not change the fact that Steve Jobs's statement contains huge logical issues.
First, Yvain is right that it is a fallacy of equivocation.
Second, any statement that attempts to make a generalization about "the East" is a HUGE over-generalization and, quite frankly, Orientalism. I mean, how does Steve Jobs justify making an assertion about Russia, China, Japan, Korea, Vietnam, and the score of other countries associated with the term "the East" from one trip to India in his youth? On what grounds do we take Steve Jobs's one trip to who knows where in India, for who knows how long, as representative of the functional value of the civilization as a whole?
Steve Jobs is using an availability heuristic, which is NOT rational.
There is sufficient evidence that the Steve Jobs quote and the second quote are not "exactly backwards," as you put it, so why did you put it that way? In my opinion, it suggests that Yvain hit the mark. Steve Jobs, or something else contained in that quote, carries personal connotations that you felt a need to defend.
But I actually can’t agree with your argument that “enlightenment” is a fallacy of equivocation. It IS the Enlightenment values of Bacon and Newton that brought us the enlightenment of vaccination and electricity—that’s not a coincidence.
I think there is some confusion in Yvain’s definition of the third type of enlightenment, and that is why you are missing the point. Yvain describes the third type of enlightenment as
“enlightenment”, meaning achieving a state of nirvana free from worldly desire.
It would be better to think of nirvana as an alternative mental state produced through a highly focused and intentional lifestyle. In this sense it is a technique for internal transformation of the individual psyche. I run every day to get blood flowing to my brain, and meditate in the evening to lower my blood pressure, calm myself, and sharpen my focus. I am not saying I am an expert on Buddhism, Hinduism, or Jainism, or that I am in a state of nirvana. What I am saying is that there are techniques for internal transformation and techniques for external transformation. What Yvain is saying is that comparing Enlightenment techniques, which focus on how best to organize and implement a person for external transformation, with Indian religious practices, which focus on how best to implement a person for internal transformation, is a false comparison. It is like trying to compare a refrigerator and an air conditioner. What defines a good refrigerator does not necessarily define a good air conditioner; what defines a good technique for external transformation does not necessarily define a good technique for internal transformation.
You say
It IS the Enlightenment values of Bacon and Newton that brought us the enlightenment of vaccination and electricity—that’s not a coincidence.
Yvain is not saying it is a coincidence. What he is saying is that vaccination and electricity are not the intended transformations of Hinduism or Buddhism. A proper equivalent would be to examine how Western Enlightenment values and techniques have benefited concentration, anger management, patience, lowering blood pressure, these types of things, which I would argue are in increasing shortage in our society.
I’m trying to find a LW essay; I can’t remember what it is called, but it is about maximizing your effort in areas of highest return. For example, if you are a baseball player, you might be around 80% in terms of pitching and 20% in terms of base running. To go from 80% up in pitching becomes exponentially harder, whereas learning the basic skill set to jump from dismal to average base running is not.
Basically, rather than continuing to grasp at perfection in one skill set, it is more efficient to maximize basic levels in a variety of skill sets related to the target field. Do you know the essay I am talking about?
That depends, of course, on what the society values. If I value oppressing people, making me more efficient just lets me oppress people more efficiently. If I value war, making me more efficient means I conduct war more efficiently.
So does rationality determine what a person or group values, or is it merely a tool to be used towards subjective values?
Sure. But that scenario implies that wanting to kill ourselves is the goal we’re striving for, and I consider that unlikely enough to not be worth worrying about much.
My scenario does not assume that all of humanity views itself as one in-group, whereas what you are saying assumes that it does. Killing ourselves and killing them are two very different things. I don’t think many groups have the goal of killing themselves, but do you not think that the eradication of competing out-groups could be seen as increasing in-group survival?
Almost entirely orthogonal.
You are going to have to explain what you mean here.
I don’t think we have a way of slowing technological progress that a) affects all actors (it wouldn’t be a better world if only those nations not obeying international law were making technological progress), and b) has no negative ideological effects.
By “negative ideological effects” do you mean the legitimization of some body of religious knowledge? As stated in my post to Dave, if your objective is to re-condition society to have a rational majority, I can see how religious knowledge (which is often narratively rather than logically sequenced) would be seen as having “negative ideological effects.” However, I would argue that there are functional benefits to religion, one of which is the limitation of power. Historically, technological progress was for millennia slowed down by religious and moral barriers, and one of the main effects of the scientific revolution was to dissolve these barriers that impeded the production of power (see Mannheim, Ideology and Utopia). However, the current constitution of American society still contains tools of limitation, even non-religious ones. People don’t often look at it this way, but taxation is used in an incredibly moral way. Governments tax highly what they want to dissuade and provide exemptions, even subsidies, for what they want to promote. The higher tax on cigarettes is a type of morally based restriction on the expansion of the tobacco industry in our society.
Stronger than taxation is the ability to flat-out illegalize something or stigmatize it. Compared to the status of marijuana as an illegal substance and the stigma it carries in many communities, the limitation of the cigarette industry through taxation seems relatively minor.
Whether through social stigma, taxation, or illegalization, there are several tools at our nation’s disposal to alter the development of industries according to subjective moral values, next to none of which are aimed at limiting the information-technology industries. There is no tax on certain types of research based on a judgment of what is right or wrong. To the contrary, the vast majority of scientific research is for the development of weapons technologies. And who are the primary funders of this research? The Department of Homeland Security and the U.S. military fund somewhere around 65-80% of academic research (this statistic might be a little off).
In regard to non-academic research, one of the primary impetuses may not be militarization, but it is without doubt entrepreneurialism. Where the primary focus of a person or group is the development of capital, the purpose of innovation becomes not fulfilling some need, but creating needs to fulfill, toward the endless goal of cultivating more wealth. Jean Baudrillard is a very interesting sociologist whose work is built around the idea that in Western society the desires (demands) of people no longer lead to the production of a supply; rather, desires (demands) are artificially produced by capitalists to match their supplies. A large part of this production is symbolic, and it ultimately distorts the motivations and actions of people to contradict the territories they live in.
Honestly, I would moderate society with more positive religious elements. In my opinion, modern society has preserved many dysfunctional elements of religion while abandoning the functional benefits. I can see that a community of rationalists would have a problem with this perspective, seeing that religion almost always results in an undereducated majority being enchanted by their psychological reflexes; but personally, I don’t see the existence of an irrational mass as unconditionally detrimental.
It is interesting to speculate about the potential of a majorly rational society, but I see no practical method of accomplishing this, nor any real reason to believe that such a configuration would necessarily be superior to the current model.
Either Swimmer or Dave: are either of you aware of a practical methodology for rationalizing the masses, or of a reason to think a more efficient society would be any less oppressive or war-driven? In fact, in a worst-case scenario, I see a world of majorly rational people transforming into an even more efficient war machine and killing us all faster. As for the pursuit of Friendly AI, I do not know that much about it. What is the perceived end goal of Friendly AI? Is it that an unbiased, unfailing intelligence replaces humans as the primary organizers and arbiters of power in our society, or is it that humanity itself is digitized? I would be very interested to know, without being told to read an entire tome of LW essays.
What would you say if I said caring about my goals in addition to their own goals would make them a better soccer player?
Thanks for the link. I’ll respond back when I get a chance to read it.
Could you show me where he argues this?
Definitely barking up the wrong tree there.
I am asking Eliezer to apply the technique described in this essay to his own belief system. I don’t see how that could be barking up the wrong tree, unless you are implying that he is somehow impervious to “spontaneously self-attack[ing] strong points with comforting replies to rehearse, then to spontaneously self-attack the weakest, most vulnerable points.”
I would like to ask if you have turned this idea against your own most cherished beliefs?
I would be really interested to hear what you see when you “close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts” rationality and singularity the most.
If you would like to know what someone who partially disagrees with you would say:
In my opinion, the objective of being a rationalist contains the same lopsided view of technology’s capacity to transform reality that you attribute to God in the Jewish tradition.
According to Jewish theology, God continually sustains the universe and chooses every event in it; but ordinarily, drawing logical implications from this belief is reserved for happier occasions. By saying “God did it!” only when you’ve been blessed with a baby girl, and just-not-thinking “God did it!” for miscarriages and stillbirths and crib deaths, you can build up quite a lopsided picture of your God’s benevolent personality.
Technology cures diseases, provides a more materially comfortable lifestyle for many people, and feeds over 7 billion people. By saying “rapid innovation did it” when blessed with a baby girl who would have died at birth without modern medical equipment, and just-not-thinking “rapid innovation did it” for ecocide, the proliferation of nuclear waste, the destruction of the ocean, the increase in cancer, and the ability to wipe out an entire city thousands of miles away, you can build up quite a lopsided picture of technological development’s beneficial personality.
The unquestioned rightness of rapid, continual technological innovation, which disregards any negative results as potential signs of a need for moderation, is what I see as the weakest point of your beliefs. Or at least of my understanding of them.
The idea of using your time and various other resources carefully and efficiently is a good virtue of rationality. Framing it as being irrational is inaccurate and kinda incendiary.
Here is my reasoning for choosing this title. If you don’t mind could you read it and tell me where you think I am mistaken.
I realize that saying ‘rationally irrational’ appears to be a contradiction. However, the idea is talking about the use of rational methodology at two different levels of analysis. Rationality at the level of goal prioritization potentially results in the adoption of an irrational methodology at the level of goal achievement.
L1: Goal Prioritization
L2: Goal Achievement
L1 rationality can result in a limitation of L2 rationality within a low-priority goal context. Let’s say that someone was watching me play a game of soccer (since I have been using the soccer analogy). As they watched, they might critique the fact that my strategy was poorly chosen and the overall effort exerted by me and my teammates was lackluster. To this observer, who considers themselves a soccer expert, it would be clear that my and my team’s performance was subpar. The observer took notes on all our flaws and inefficient habits, then after the game wrote them all up to present to us. Upon telling me all these insightful critiques, the observer is shocked to hear that I am grateful for his effort, but am not going to change how I or my team plays soccer. He tries to convince me that I am playing wrong, that we will never win the way I am playing. And he is correct. To any knowledgeable observer, I was playing the game of soccer poorly, even irrationally. Without knowledge of L1 (which is not observable), the execution of L2 (which is observable) cannot be deemed rational or irrational, and in my opinion, will appear irrational in many situations.
Would you say that, to you, it appears irrational that I have chosen to label this idea ‘rationally irrational’? If that is correct, I would suggest that I have some L1 that you are unaware of, and that while my labeling is irrational in regard to L2 (receiving high karma points / recognition in publishing my essay on your blog), I have de-prioritized this L2 for the sake of my L1. What do you think?
Ok, so then I would say that the soccer player, in being empathetic to my objectives, would be strengthening his or her emotional/social capacity, which would benefit his or her health/productivity, and thus benefit his or her soccer playing.