A useful idea I’ve been looking into more lately is “weirdness points”. Basically, some good ideas are really unusual, and people are often biased against unusual ideas. It’s often seen as easier to fight for one weird thing than for multiple weird things. Therefore, we ought to prioritize what we fight for, so the most important things get our weirdness points, and tamp down on the weirdness in other areas of our lives and work.
Typical social explanations of weirdness points aren’t completely helpful. Power, status, and wealth would seem to bestow weirdness points, but politicians, celebrities, and wealthy people aren’t always as free from the weirdness constraint as one would guess.
Maybe communities and media are fracturing so much that weirdness points depend more on your community than on your actions. (The social-psych concept “idiosyncrasy credits” is defined in terms of a group’s expectations, not those of society at large or of people who are reachable but not already on your side.)
Weirdness points seem like a valuable (and limited) resource, especially if you are promoting or enacting multiple ideas (A.I. safety and improving rationality and open borders, for example). As with anything valuable to our goals, we ought to figure out if we can get it, and at what cost.
So, the questions for discussion:
What actually determines weirdness points?
Are weirdness points important or useful or accurate, as a predictive model? How constrained are people’s actions, really, in promoting weird ideas? In what contexts?
How can one gain weirdness points?
Has anyone at any time “hacked” weirdness points, by successfully promoting multiple weird things / having weird habits / having a weird personality, without eventually running their status/credibility into the ground? (The only person I can think of offhand might be Richard Feynman.)
People do not punish nonconformity per se. People punish nonconformity iff it is a problem. If someone punishes you for being weird then that means your weirdness has caused a problem. If you can stop causing problems for other people then you can get away with being weird.
I walk around barefoot outside where there is broken glass. Instead of hosting my personal website on WordPress, I created my own content management system…in Lisp. I wrote this answer in Vim through i3 on a Linux machine. I am a heretical savant high on cocaine. I wrote a series of posts on how to become even weirder. Yesterday, I stared at a grass field for so long my eyes malfunctioned.
I get away with being weird because I do not cause problems for other people. The value of keeping me around outweighs the cost.
Promoting unpopular ideas turns you into a problem.
Fighting the ordinary people around you turns you into a problem. The simplest way to preserve weirdness points is to not fight for things.
Promoting unpopular ideas costs social capital. How much you can influence other people is a good definition of social capital. If you want to get away with disruptive activities then you can increase your social capital or minimize the disruption you cause.
It’s very hard to know to what extent one gets punished for nonconformity. You don’t know about the event invitations that you didn’t get because you were seen as too weird.
This is way clearer thinking than I previously had about this topic. Thank you!
People are more likely to be vocal about punishing non-conformity if it is a problem. But I think there’s something very much like an anti-halo effect that surrounds people who are perceived as weird.
After observing too many cases of non-conforming people causing problems, people may update and start punishing non-conformity directly.
“Weirdness” is probably a bad abstraction, because it includes things with opposite effects. From a statistical perspective, being a king is weird, being a billionaire is weird, being a movie star is weird. Yet this is obviously not what we mean when we talk about carefully spending our weirdness points.
Here is a hypothesis I just made up, with no time to test it: Maybe people instinctively try to classify everyone else into three buckets: “higher-status than me”, “the same as me”, and “lower-status than me”. The middle bucket is defined by similarity: if you are sufficiently similar to me, you are there. If you are dissimilar, the choices are only “higher” and “lower”. (In other words, the hypothesis is that the instinctive classification does not support the notion of “different but equal”.) Because you do not assign people higher status for no reason, it follows that if you are different from me, and there is no evidence of you being higher-status than me, then I will perceive and treat you as lower-status. And if you refuse to be treated as lower-status, I will punish you for acting above your status.
From this model, it follows that for people visibly above you, weirdness is not a problem. You expect the king to have unusual manners and opinions compared to the peasants. It is the weird peasant everyone makes fun of.
The answer then is that you must achieve superior status first, and show your weirdness later. Then people will assume that these things are related.
“Weird” is not a statistical term, and showing that some statistical notion of weirdness is a bad abstraction says little about whether the concept in its usual sense is a good abstraction.
I think it’s best to view weirdness points as a fake framework.
I don’t think there is, at any level of abstraction, an accurate gears-level model that includes weirdness points as a gear. But if you’re just trying to make quick-and-dirty judgments about what you can get away with, it’s an excellent heuristic.
When you look at the gears of this phenomenon, I think you end up looking at signaling and countersignaling, which will give you more accurate answers than trying to count weirdness points.
Given that you can’t have a quark-level model, what counts as a gears-level model?
A model that makes accurate predictions at a given level of abstraction and can handle many cases at that level. E.g., if the level of abstraction is “human behavior” (rather than, say, quarks), it should give accurate predictions about the human-behavior abstraction.
What you are talking about is a function of the given level of description, not an absolute. So there is a level of abstraction at which “weirdness points” works.
Probably, but not a very useful one. It’s better to just use natural levels of abstraction like “human behavior” and recognize that, at that level, this is not a gears-level model but a heuristic. I can’t really think of a natural abstraction at which weirdness points are usefully gearsy rather than a heuristic.
“Gears-level” is defined in terms of usefulness, and so is “heuristic”.
Gears-level is defined by predictive power at a given level of abstraction; heuristic is defined by something like “speed/prediction power” at a typical level of abstraction, or something. Whether you want gears or heuristics really depends on how much time you have and how much time you’re going to spend with the model (typically, heuristics).
Are they really different? Why would you want to use a high level of abstraction if not to save compute?
Yes.
You do want to use it to save compute.
So why are they different?
I loudly promote a large number of rather contentious ideas. In particular, I am an animal rights hardliner (an active member of Direct Action Everywhere) and a socialist, on top of the big rationalist stereotypes (singularity is near, poly, etc.). I certainly annoy a lot of people, but socially I am doing well. I have many friends, an amazing long-term relationship, and am doing well financially. You can read my blog to see the sort of beliefs I promote.
It is unclear why this works out for me. I look rather average, which might help? Plausibly I have some sort of social skills that help me smooth things over if they get too hot. I handle conflict fairly well. It seems empirically true that many people are socially successful despite holding extremely controversial views. In some cases it even seems to help them?