Once you tune your radio in, you may find such occasions more exciting.
For me, understanding “what’s really going on” in typical social interactions made them even less interesting than when I didn’t. At least back then it was a big mystery to be solved. Now I just think, what a big waste of brain cells.
Roko, do you personally find these status and alliance games interesting? Why? I mean, if you play them really well, you’ll end up with lots of allies and high status among your friends and acquaintances, but what does that matter in the larger scheme of things? And what do you think of the idea that allies and status were much more important in our EEA (i.e., tribal societies) than today, and as a result we are biased to overestimate their importance?
but what does that matter in the larger scheme of things?
Well, that depends upon your axiology.
If you are concerned with existential risk, then it is worth noting that the movement has an undersupply of “people people”, a big gender imbalance, and an undersupply of money. (I think that the ability to make money is determined, to some extent, by your skill at these social games.)
You may feel that status within a social group is an end in itself.
If you are concerned with academic learning, discovering new mathematics/philosophy, then getting better at these social games is probably not so important.
I mean, if you play them really well, you’ll end up with lots of allies and high status among your friends and acquaintances, but what does that matter in the larger scheme of things? And what do you think of the idea that allies and status were much more important in our EEA (i.e., tribal societies) than today, and as a result we are biased to overestimate their importance?
Their importance is a function of our values, which came from the EEA and are not so easily changed. Those values, like wanting friendship, community, relationships, and respect, are a part of what makes us human.
I actually don’t interpret social interactions as “status and alliance games,” which is kind of cynical and seems to miss the point. Instead, I try to recognize that people have certain emotional requirements that need to be met in order to gain their trust, friendship, and attraction, and that typical social interactions are about building that type of trust and connection.
Most of what we call values seem to respond to arguments, so they’re not really the kind of fixed values that a utility maximizer would have. I would be wary about calling some cognitive feature “values that came from the EEA and are not easily changed”. Given the right argument or insight, they probably can be changed.
So, granted that it’s human to want friendship, community, etc., I’m still curious whether it’s also human to care less about these things after realizing that they boil down to status and alliance games, and that the outcomes of these games don’t count for much in the larger scheme of things.
Well, is it also human to stop desiring tasty food once you realize that it boils down to super-stimulation of hardware that evolved as a device for impromptu chemical analysis to sort out nutritionally adequate stuff from the rest?
As for the “larger scheme of things,” that’s one of those emotionally-appealing sweeping arguments that can be applied to literally anything to make it seem pointless and unworthy of effort. Selectively applying it is a common human bias. (In fact, I’d say it’s a powerful general technique for producing biased argumentation.)
Well, is it also human to stop desiring tasty food once you realize that it boils down to super-stimulation of hardware that evolved as a device for impromptu chemical analysis to sort out nutritionally adequate stuff from the rest?
Not to stop desiring it entirely, but to care less about it than if I didn’t realize, yes. (I only have a sample size of one here, namely myself, so I’m curious if others have the same experience.)
As for the “larger scheme of things,” that’s one of those emotionally-appealing sweeping arguments that can be applied to literally anything to make it seem pointless and unworthy of effort. Selectively applying it is a common human bias. (In fact, I’d say it’s a powerful general technique for producing biased argumentation.)
I don’t think I’m applying it selectively… we’re human and we can only talk about one thing at a time, but other than that I think I do realize that this is a general argument that can be applied to all of our values. It doesn’t seem to affect all of them equally though. Some values, such as wanting to be immortal, and wanting to understand the nature of reality, consciousness, etc., seem to survive the argument much better than others. :)
I think I do realize that this is a general argument that can be applied to all of our values. It doesn’t seem to affect all of them equally though. Some values, such as wanting to be immortal, and wanting to understand the nature of reality, consciousness, etc., seem to survive the argument much better than others. :)
Honestly, I don’t see what you’re basing that conclusion on. What, according to you, determines which human values survive that argument and which not?
Honestly, I don’t see what you’re basing that conclusion on.
I’m surprised that you find the conclusion surprising or controversial. (The conclusion being that some values survive the “larger scheme of things” argument much better than others.) I know that you wrote earlier:
As for the “larger scheme of things,” that’s one of those emotionally-appealing sweeping arguments that can be applied to literally anything to make it seem pointless and unworthy of effort.
but I didn’t think those words reflected your actual beliefs (I thought you just weren’t paying enough attention to what you were writing). Do you really think that people like me, who do not think that literally everything is pointless and unworthy of effort, have just avoided applying the argument to some of our values?
What, according to you, determines which human values survive that argument and which not?
It seems obvious to me that some values (e.g., avoiding great pain) survive the argument by being hardwired to not respond to any arguments, while others (saving humanity so we can develop an intergalactic civilization, or being the first person in an eventually intergalactic civilization to really understand how decisions are supposed to be made) are grand enough that the “larger scheme of things” argument just doesn’t apply. (I’m not totally sure I’m interpreting your question correctly, so let me know if that doesn’t answer it.)
Do you really think that people like me, who do not think that literally everything is pointless and unworthy of effort, have just avoided applying the argument to some of our values?
Those are the only logical possibilities: it’s either that, or you have thought about it and concluded that the argument is not applicable to some values. I don’t find the reasons for this conclusion obvious, and I do see many selective applications of this argument as a common bias in practice, which is why I asked.
It seems obvious to me that some values (e.g., avoiding great pain) survive the argument by being hardwired to not respond to any arguments, while others (saving humanity so we can develop an intergalactic civilization, or being the first person in an eventually intergalactic civilization to really understand how decisions are supposed to be made) are grand enough that the “larger scheme of things” argument just doesn’t apply. (I’m not totally sure I’m interpreting your question correctly, so let me know if that doesn’t answer it.)
Yes, that answers my question, thanks. I do have disagreements with your conclusion, but I grant that you are not committing the above mentioned fallacy outright.
In particular, my objections are that: (1) for many people, social isolation and lack of status is in fact a hardwired source of great pain (though this may not apply to you, so there is no disagreement here if you’re not making claims about other people), (2) I find the future large-scale developments you speculate about highly unlikely, even assuming technology won’t be the limiting factor, and finally (3) even an intergalactic civilization will matter nothing in the “larger scheme of things” assuming the eventual heat death of the universe. But each of these, except perhaps (1), would be a complex topic for a whole other discussion, so I think we can let our disagreements rest at this point now that we’ve clarified them.
Agreed: this is an instance of the Godshatter concept.
What makes the desire to obtain high status within some small group a legitimate piece of Godshatter (good), as opposed to a kind of scope insensitivity (bad)? Or to put it another way, why isn’t scope insensitivity (the non-linear way that a typical human being values other people’s suffering) also considered Godshatter?
Voted up as this is an important and general question about converting human intuitions into a formal utility function.
What makes the desire to obtain high status within some small group a legitimate piece of Godshatter (good), as opposed to a kind of scope insensitivity (bad)?
Do we have a general criterion for deciding these things? Or is it still unresolved in general?
In this specific case, it seems to me that there are many aspects of social interaction that are zero-sum or even negative sum. For the purpose of Coherent Extrapolated Volition, zero sum or negative sum elements are like scope insensitivity, i.e. bad.
There are clearly some social status games that are positive sum.
Do we have a general criterion for deciding these things? Or is it still unresolved in general?
I think it’s unresolved in general. I brought up scope insensitivity as a counter-example to the “Godshatter” argument, or at least a strong form of it which says we should keep all of the values that evolution has handed down to us. It seems likely that we shouldn’t, but exactly where to draw the line is unclear to me. Still, to me, desire for high status in some small group seems to be the same kind of “crazy” value as scope insensitivity.
In this specific case, it seems to me that there are many aspects of social interaction that are zero-sum or even negative sum. For the purpose of Coherent Extrapolated Volition, zero sum or negative sum elements are like scope insensitivity, i.e. bad.
I wasn’t talking about CEV, I was mainly talking about what you or I should value, now, as individuals. I’m not sure that positive-sum/zero-sum has much to do with that.
Deciding which psychological drives to keep, and which to abandon, is the same as figuring out full formal preference (assuming you have more expressive power than just keeping/abandoning), so there is no heuristic for doing that simpler than full formal preference. This problem isn’t just unresolved, it’s almost FAI-complete (preference theory, as opposed to efficient implementation).
Now I just think, what a big waste of brain cells.
However, that’s not how human brains work. It’s not like someone who on an average day spends, say, eight hours doing intellectual work and four hours socializing could do 50% more useful intellectual work by spending 12 hours working instead of socializing. For the overwhelming majority of people, it’s impossible to employ their brains productively for more than a few hours a day. You get tired and lose focus to the point where you’re just making a mess instead of progress.
Similarly, if you develop skills independent of your main intellectual pursuits, it’s not like they will automatically steal resources and make you less productive. The human brain just doesn’t work that way. On the contrary, a suitable schedule of entertaining diversions can increase your productivity in your main pursuit.
Of course, there are exceptions. Some people really can spend nearly all their waking hours intensely focused and fully productive, without the need or want for anything more in their lives. However, this is a very small minority, even among people working in math, hard science, and technical professions.
And what do you think of the idea that allies and status were much more important in our EEA (i.e., tribal societies) than today, and as a result we are biased to overestimate their importance?
That argument can be used to deny the importance of absolutely everything you do. Unless you believe that some part of you came into existence supernaturally, or you’re carrying some highly consequential recent mutation, absolutely everything in your thoughts and deeds is a result of some impulse that evolved in the EEA (although of course it might be manifesting itself in a way very different from the original in today’s environment).
absolutely everything in your thoughts and deeds is a result of some impulse that evolved in the EEA
-- unless you’re using “impulse” in a very broad sense. Plenty of thoughts and deeds (even in the System 1) are the result of your brain’s inputs in your life so far.
For me, understanding “what’s really going on” in typical social interactions made them even less interesting than when I didn’t.
Merely “tuning in” to a social interaction isn’t enough. Subtextual conversations are often tedious if they’re not about you. You have to inject your ego into the conversation for things to get interesting.
do you personally find these status and alliance games interesting? Why?
They’re way more interesting than video games, for example. Or watching television. Or numerous other activities people find fun and engaging. Of course, if you’re bad at them you aren’t going to enjoy them; the same goes for people who can’t get past the first stage of Pac-Man.
Video games have a lot of diversity to them and different genres engage very different skills. Small talk all seems to encompass the same stuff, namely social ranking.
Some of us know how to do it but just don’t -care-, and that doesn’t mean we’re in fact bad at it. I think that is the point this comment thread is going for.
Be careful when you notice more diversity in subject matter you’re a fan of than in subject matter that you’re not. I’m not sure if there’s a name for this bias, but there should be.
When you do that sort of thing to people, it’s called stereotyping of the group you don’t like. I don’t know of a word for noticing distinctions in the thing or people you do like.
There’s also the fact that video games … have a freaking rule book, which tells you things that aren’t complete fabrications designed to make you fail the game if you’re stupid enough to follow them.
I thought for a bit that it would be interesting to have, say, a WWI game where the tutorial teaches you nineteenth-century tactics and then lets you start the game by throwing massed troops against barbed wire, machineguns, and twentieth-century artillery. The slaughter would be epic.
This is something that’s been discussed a few times on LW, but I don’t think it’s accurate. I don’t think there are two sets of rules, a “real” one and a “fake” one. Rather, I think that the rules for social interaction are very complicated and have a lot of exceptions, and any attempt to discuss it will inevitably be oversimplified. Temple Grandin’s book discusses this idea: all social rules have exceptions that can’t be spelled out in full.
The status test (actually a social skills test) isn’t to see if you fail by being stupid enough to follow the “fake” rules rather than the “real ones”. It’s to see if you’re savvy enough to understand all the nuances and exceptions to the rules.
...with video games, the printed, widely available strategy guides often tend to be lacking. For adventure games or Final Fantasy-type games, you can often get decent walkthroughs. But for many games, like say, Diablo II (thinking of the last strategy guide I read), the strategy guide sold in mainstream bookstores can’t get you much farther than a n00b level of play.
To actually get good, the best thing to do is to go to online forums and listen to what people who are actually experienced at the game are saying.
In the case of both social skills and video games, the best way to learn is to practice, and to get advice from the source: people who already broke down the task and are experienced and successful at it, not the watered-down crap in mainstream bookstores.
Right, but at least with video games, the rule book tells you what the game is, and what it is you’re judged on. That gives you enough to make sense of all the other advice people throw at you and in-game experience you get, which is a lot more than you can say of social life.
You effectively answered your own comment, but to clarify -
Strategy guides on dead tree have been obsolete for more than a decade. GameFAQs is over a decade old, and it’s the best place to go for strategies, walkthroughs, and message boards full of analysis by armies of dedicated fans. People are still finding new and inventive strategies to optimize their first-generation Pokemon games, after all. Games have long passed the point on the complexity axis where the developer’s summary of the point of the game is enough to convey an optimal strategy.
It’s a bad analogy because there are different kinds of games, but only one kind of small talk? If you don’t think pub talk is a different game than a black tie dinner, well, you’ve obviously never played. Why do people do it? Well, when you beat a video game, you’ve beat a video game. When you win at social interaction, you’re winning at life—social dominance improves your chances of reproducing.
As for rule books: the fact that the ‘real’ rules are unwritten is part of the fun. Of course, that’s true for most video games. Pretty much any modern game’s real tactics come from players, not developers. You think you can win a single StarCraft match by reading the manual? Please.
No, pub talk is not exactly the same as a black tie dinner. The -small talk- aspect, though, very much is. It all comes down to social ranking of the participants. In the former, it skews toward assortative mating, and in the latter presumably toward power and resources in the business world.
If you have a need or desire to win at social interaction, good for you. Please consider that for other people, it -really- isn’t that important. There is more to life than attracting mates and business partners. Those things are often a means to an end, and it is preferable to some of us to pursue the ends directly when possible.
Deciding which psychological drives to keep, and which to abandon, is the same as figuring out full formal preference (assuming you have more expressive power than just keeping/abandoning), so there is no heuristic for doing that simpler than full formal preference. This problem isn’t just unresolved, it’s almost FAI-complete (preference theory, as opposed to efficient implementation).
Can you expand on this? It isn’t logically inconsistent to want to have status…
My guess:
Status should be about gaining allies and mates, correct? Just as charity is about helping people.
Gaining more allies and mates (especially for a male) should be better than gaining fewer, agreed? If so, why do maths professors spend so much time and effort trying to gain status in the small world of maths? They would be better off appealing to the lowest common denominator and using their intellect to wow people in something more accessible.
The quality of the allies also matters. Having allies that can’t help you in your chosen goals is a drain on resources.
Be careful when you notice more diversity in subject matter you’re a fan of than in subject matter that you’re not. I’m not sure if there’s a name for this bias, but there should be.
I would expect this is mostly people being more familiar with subject matter they’re a fan of, but it could also be related to outgroup homogeneity bias.
That’s definitely it. I suspect it’s too much like work for most people to pay attention to the details of things they aren’t fond of.
My father disparages all video games as being “little men running around on a screen”.
When you do that sort of thing to people, it’s called stereotyping of the group you don’t like. I don’t know of a word for noticing distinctions in the thing or people you do like.
Could it just be characterized as a specific example of the halo effect?
There’s also the fact that video games … have a freaking rule book, which tells you things that aren’t complete fabrications designed to make you fail the game if you’re stupid enough to follow them.
I really like the idea of creating a video game with a deceptive rulebook.
I thought for a bit that it would be interesting to have, say, a WWI game where the tutorial teaches you nineteenth-century tactics and then lets you start the game by throwing massed troops against barbed wire, machineguns, and twentieth-century artillery. The slaughter would be epic.
I really like this idea too. Portal does this to some extent, but the idea could be taken much farther.