OK, I upvoted it before reading, and now that I have read it, I wish there were a karma transfer feature, so I could upvote it a dozen times more :) Besides the excellent content, it is exemplarily written (an engaging multi-level state-explain-summarize style, with quality examples throughout).
By the way, speaking of karma transfer, here is one specification of such a feature: anyone with, say, 1000+ karma should be able to specify the number of upvotes to give, up to 10% of their total karma (diluted 10x for Main posts, since currently each Main upvote gives 10 karma points to the OP). The karma minimum and the transfer cap are there to prevent misuse of the feature via sock puppets.
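Sketching it in code, the rule might look something like this (purely illustrative; the names and constants are mine, not an actual LW mechanism):

```python
# Toy sketch of the proposed karma-transfer rule. All names and
# constants are illustrative assumptions, not an existing LW API.
MIN_KARMA_TO_TRANSFER = 1000  # threshold: deters sock-puppet abuse
MAX_TRANSFER_FRACTION = 0.10  # give away at most 10% of total karma
MAIN_DILUTION = 10            # a Main upvote gives the OP 10 karma points

def max_upvotes(giver_karma: int, is_main_post: bool) -> int:
    """How many upvotes a user may grant on a single post."""
    if giver_karma < MIN_KARMA_TO_TRANSFER:
        return 1  # below the threshold: just the ordinary single upvote
    budget = int(giver_karma * MAX_TRANSFER_FRACTION)
    # On Main each upvote is worth 10x, so the budget buys 10x fewer votes.
    return budget // MAIN_DILUTION if is_main_post else budget
```

So under these made-up numbers, a user with 10k karma could grant up to 1000 extra upvotes on a Discussion post, or 100 on a Main post.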
Now, back to the subject at hand. What you call compartmentalization and what jimmy calls attention shifting I imagine in terms of the abstract data type “stack”: in your current context you create an instance of yourself with a desired set of goals, then push your meta-self onto the stack and run the new instance. It is, of course, essential that the instance you create actually pops the stack and yields control at the right time (and does not go on creating and running more instances, until you get a stack overflow and require medical attention to snap back to reality). Maybe it’s an alarm that goes off internally or externally after a certain time (a standard feature in hypnosis), and/or after a certain goal is achieved (e.g. 10 pages of a novel).
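A toy sketch of the metaphor, with the pop conditions made explicit (everything here is illustrative, nothing more):

```python
import time

class SelfInstance:
    """Toy model of the 'stack of selves': an instance runs with its own
    goals until a pop condition (an alarm or an achieved goal) fires."""
    def __init__(self, goals, time_limit=None, goal_test=None):
        self.goals = goals
        self.deadline = time.time() + time_limit if time_limit else None
        self.goal_test = goal_test

    def should_pop(self, state) -> bool:
        # Yield control back to the meta-self when the alarm goes off or
        # the goal is met; without some such condition you risk the
        # mental "stack overflow" described above.
        timed_out = self.deadline is not None and time.time() >= self.deadline
        goal_met = self.goal_test is not None and self.goal_test(state)
        return timed_out or goal_met

stack = ["meta-self"]  # push the current self before running the instance
novelist = SelfInstance(goals=["write"],
                        time_limit=3600,  # the alarm: one hour
                        goal_test=lambda s: s.get("pages", 0) >= 10)
# ... run `novelist` until novelist.should_pop(state) is True, then it
# yields control and the meta-self resumes via stack.pop().
```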
Also, as others have pointed out, your ideas might be an easier sell if they were packaged not as Dark Arts (which they are not) but as meta-rationality. For example: “local irrationality may be globally rational”; or, somewhat more mathematically, “the first approximation to rationality need not appear rational”; or, nerdier still, “the rationality function is nonlinear; its power series has a finite radius of convergence”; or something else, depending on the audience. Then again, calling it Dark Arts has a lot of shock value on this forum, which attracts interest.
What you call compartmentalization and what jimmy calls attention shifting I imagine in terms of the abstract data type “stack”
To clarify, what he calls compartmentalization I call compartmentalization. I’d just recommend doing something else, which, if forced to name it, I’d call something like “having genuine instrumental goals instead of telling yourself you have instrumental goals”.
When I say “attention shifting” I’m talking about the “mental bit banging”-level thing. When you give yourself the (perhaps compartmentalized) belief that “this water will make me feel better because it has homeopathic morphine in it”, that leads to “so when I drink this water, I will feel better”, which leads to anticipating feeling better, which leads to pointing your attention to good feelings to the exclusion of bad ones.
However, you can get to the same place by deciding “I’m going to drink this water and feel good”, or, when you get more practiced at it, just doing the “feeling good” by directing attention to goodness without all the justifications.
(That bit of) my point is that knowing where the attention is screens off how it got there, in terms of its effects, so you might as well get there through a way that does not have bad side effects.
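In probability terms this is a conditional-independence claim: P(effect | attention, method) = P(effect | attention). A toy illustration with made-up numbers:

```python
# Toy illustration of "screening off": the effect depends only on the
# final attention state, not on the method that produced it.
# The numbers are made up purely for illustration.
def p_feel_better(attention_on_good: bool, method: str) -> float:
    # `method` ("placebo belief" vs. "direct redirection") is ignored:
    # P(effect | attention, method) == P(effect | attention)
    return 0.8 if attention_on_good else 0.2

assert p_feel_better(True, "placebo belief") == p_feel_better(True, "direct redirection")
```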
LW already feels uncomfortably polarized with a clique of ridiculously high-karma users at the top. I don’t think giving additional power to the high-karma users is a good idea for the long term.
LW already feels uncomfortably polarized with a clique of ridiculously high-karma users at the top.
Huh, never noticed that. A clique? What an interesting perspective. How much karma do you mean? Or is it some subset of high-karma users? For example, I happen to have just over 10k karma, does it make me a clique member? What about TheOtherDave, or Nancy? How do you tell if someone is in this clique? How does someone in the clique tell if she is?
Presumably you joined a while ago, when there weren’t so many intimidating high-karma users around.
For example, I happen to have just over 10k karma, does it make me a clique member? What about TheOtherDave, or Nancy?
Yes on all counts. You’re clearly the cool kids here.
How do you tell if someone is in this clique?
You see them talk like they know each other. You see them using specialized terms without giving any context because everybody knows that stuff already. You see their enormous, impossible karma totals and wonder if they’ve hacked the system somehow.
How does someone in the clique tell if she is?
Dunno. It probably looks completely different from the other side. I’m just saying that’s what it feels like (and this is bad for attracting new members), not that that’s what it’s really like.
(nods) True enough.
That said, you’re absolutely right that it looks completely different from “the other side.”
What does it look like?
Well, for example, I’m as aware of the differences between me and shminux as I ever was. From my perspective (and from theirs, I suspect), we aren’t nearly as homogeneous as we apparently are from lmm’s perspective.
For the record, I have also noticed this subset of LW users (I tend to think of them as “Big Names”), and:
- You could ask the same of any clique;
- It seems like these high-profile members are actually more diverse in their opinions than mere “regulars”.
Of course, this is just my vague impression. And I doubt it’s unique to LessWrong, or particularly worrying; it’s just, y’know, some people are more active members of the community or however you want to phrase it.
(I’ve noticed similar “core” groups on other websites, it’s probably either universal or a hallucination I project onto everything.)
Relevant link: The Tyranny of Structurelessness (it’s mostly about real-life political groups, but much of it applies here):
Contrary to what we would like to believe, there is no such thing as a structureless group. Any group of people of whatever nature that comes together for any length of time for any purpose will inevitably structure itself in some fashion [...]
Elites are nothing more, and nothing less, than groups of friends who also happen to participate in the same political activities. They would probably maintain their friendship whether or not they were involved in political activities; they would probably be involved in political activities whether or not they maintained their friendships. It is the coincidence of these two phenomena which creates elites in any group and makes them so difficult to break.
These friendship groups function as networks of communication outside any regular channels for such communication that may have been set up by a group. If no channels are set up, they function as the only networks of communication. Because people are friends, because they usually share the same values and orientations, because they talk to each other socially and consult with each other when common decisions have to be made, the people involved in these networks have more power in the group than those who don’t. And it is a rare group that does not establish some informal networks of communication through the friends that are made in it. [...]
Because elites are informal does not mean they are invisible. At any small group meeting anyone with a sharp eye and an acute ear can tell who is influencing whom. The members of a friendship group will relate more to each other than to other people. They listen more attentively, and interrupt less; they repeat each other’s points and give in amiably; they tend to ignore or grapple with the “outs” whose approval is not necessary for making a decision. But it is necessary for the “outs” to stay on good terms with the “ins.” Of course the lines are not as sharp as I have drawn them. They are nuances of interaction, not prewritten scripts. But they are discernible, and they do have their effect. Once one knows with whom it is important to check before a decision is made, and whose approval is the stamp of acceptance, one knows who is running things.
I always assumed I’d get a black cloak and a silver mask in the mail once I break 10K and take the Mark of Bayes. Isn’t that what happens?
We aren’t supposed to talk about it.
What if going beyond 1 vote cost karma, so you’d actually need to spend, not just apply, that karma?
I can’t see people using it to the point where their karma-flow went negative, so I don’t think it really helps. It’s a less bad idea, but not, I think, a good one.
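For concreteness, the “spend, not just apply” variant would differ from the apply-only sketch above by an explicit debit, something like this (illustrative names and numbers again):

```python
MIN_KARMA_TO_TRANSFER = 1000  # same illustrative threshold as above
MAIN_DILUTION = 10            # Main upvotes cost 10x

def spend_upvotes(giver_karma: int, n_votes: int, is_main_post: bool) -> int:
    """Variant where extra votes are spent, not just applied: the
    giver's own karma actually goes down. Illustrative only."""
    cost = n_votes * (MAIN_DILUTION if is_main_post else 1)
    remaining = giver_karma - cost
    if remaining < MIN_KARMA_TO_TRANSFER:
        raise ValueError("transfer would drop the giver below the threshold")
    return remaining  # the giver's new, lower karma total
```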