Well in my case the thing that stands out is that the ADHD diagnosis was given after a very quick/superficial evaluation, whereas the AS diagnosis came after many months of testing, evaluation, and thorough analysis of my developmental history. I cannot exactly speak to whether the two configurations can or cannot coexist in the same person without further study, but my suspicion is that AS and ADHD can appear superficially similar to adults who are observing children and teenagers, merely because the child or teenager does not appear to be attending to what the adult wishes they were.
AnneC
Full disclosure: I have been diagnosed with Asperger’s (and prior to that, PDD-NOS and ADHD). I am also female and 31 years old at the time of this writing.
All that said, one thing I persistently have trouble with is thinking in terms of “isms” in the first place (well, with the exception of “autism”, though that is a neurological configuration rather than an ideology). Hence I have no idea, really, whether my default mode(s) of thought fall into the “utilitarian/consequentialist” schools, and I have a very difficult time following the sorts of discussions wherein people are constantly trying to figure out whether a given notion fits in with this-ism or that-ism, or even where folks seem to be worrying about keeping everything they decide in line with a given, externally-sourced “system” of organizing ideas.
(This, incidentally, is why I stopped identifying as a “transhumanist”—I just could not figure out what the word meant, it seemed to “lose meaning” the more I examined it, and eventually I determined the energy expenditure of even continuing to attempt discerning that thing was not worth it, so I disconnected myself from identification with the term. I still maintain my strong interests in longevity, human-machine interface, and whatnot, but I do not believe they NEED an over-arching ideology or what-have-you in order for my interest to be legitimate).
As far as making ethical decisions goes for me, my impression (as much as my own insight can be considered reliable here; I don’t know for sure) is that I do usually invoke certain very basic principles (bodily autonomy, for instance) but that I tend to consider specific situations on a very individual basis each time, without concerning myself so much over being “consistent”. Different situations can certainly share pattern-elements with one another, of course, and I can notice that, but for the most part—and I don’t know if it is a language processing thing or what—I seem to have more trouble than most people who frequent this site with “ism”-based discussions. Moreover, while I believe my thinking to be quite rational and logical most of the time, I sort of burned out on heavy debate/argument over the 2006-2008 time frame, and hence these days I say less about “heavy” subjects on the Internet than I used to.
EDIT: …and the point of all that was to basically suggest (albeit without reference to data other than my own observations and pattern-identification skills, so take the suggestion for whatever you think it is worth on that basis) that while you may indeed find SOME correlation between AS/autism and whatever you consider to be particularly “utilitarian” or “consequentialist” thought, my take is that this is only one specific possible manifestation of “autistic specialization”. Which is to say that some of us may indeed specialize in more abstract areas; however, there are also those of us who remain welded to the “concrete” and hence are less likely to be found as, say, regular LW commenters. Personally I identify, for instance, more with the “engineer” than the “philosopher” archetype, though that has little to no bearing on the presence of otherwise-inclined autistic persons in this or other forums.
This. I’m not “creeped out” by people merely talking about PUA techniques, but I do find it boring, irrelevant, and pretty much useless in terms of any capacity to improve my thinking abilities. I don’t think all examples / analogies used to make a point about rationality, etc., need to be things everyone can identify with (that would likely be impossible anyway), but PUA stuff really is sort of distractingly specific to the “hetero males trying to score hot chicks” demographic. I’d just as soon be reading about how to choose the best golf shoes.
Data point: I am physically (and I am figuring, genotypically) female but have never felt that I have an “internal feminine identity” of any kind. I used to think the whole idea of such an internal identity was a socially-imposed myth. It was not until I encountered trans women / trans men who very, very clearly had an internal identification that strongly differed from their sex phenotype that it became evident to me that some people (and possibly most cisgendered persons, even) really and truly did have an internal gender “sense”.
Look up James Tiptree Jr. (the pseudonym used by sf writer Alice Sheldon) for a great example of a female sf author who “passed” not only as male, but as manly (in the opinion of many men who read her work) until her true identity was revealed.
Another nit about drivers’ licenses (full disclosure: I don’t have one, and I live in the USA): from what I’ve seen, treating a driver’s license as an indicator of “real world success” is a very American phenomenon. Anecdotally, the Europeans I’ve encountered seem significantly less likely on average than Americans of the same age to have licenses, and there’s less of a stigma (if any) associated with not having one.
No one person is “in charge of the future of humanity”. I know you were probably being somewhat flippant, but still.
Well, for one thing, privilege is a major source of bias, and when a person doesn’t even realize they (or those they admire) have particular types/levels of privilege, they’re going to have a harder time seeing reality accurately.
E.g., when I was younger, I used to think that racism didn’t exist anymore (that it had been vanquished by Martin Luther King, or something, before I was even born) and didn’t affect anyone, and that if someone didn’t have a job, they were probably just lazy. Learning about my own areas of privilege made it possible for me to see that things were a lot more complicated than that.
Of course it’s possible for people to go too far the other way, and end up totally discounting individual effort and ability, but that would fall under the category of “reversed stupidity” and hence isn’t what I’m advocating.
(And that’s all I’m going to say in this thread for now—need to spend some more time languaging my thoughts on this subject.)
For what it’s worth I don’t think you’ve deliberately set out to become a “cult leader”—you seem like a sincere person who just happens to be going about life in a rather nonstandard fashion. You’ve got some issues with unacknowledged privilege and such, and I’ve gotten impressions from you of an insufficiently critical attraction to power and to people who have power, but that’s hardly unique.
I think mostly it’s that you confuse people by sending off a lot of signals they don’t expect—they think you must have some weird ulterior motive for not having gone to college, and public discussion of your own intellect, instead of being seen as merely the result of somewhat atypical social skills, is seen as inexcusable arrogance.
That said, because of my own negative experience(s) with people who’ve seemed, shall I say, rather “sparkly” at first, but who HAVE turned out to be seeking puppy-dog supplicants (or worse), I tend to be very very cautious these days when I encounter someone who seems to attract a fan club.
With you I’ve gone back and forth in my head many times as to whether you are what you first struck me as (a sincere, if a bit arrogant, highly ambitious guy) or something more sinister. It’s been difficult to tell as you’re sort of surrounded by this buzzing cloud of subcultural interference, but at this point I’ve sort of determined that if there’s anything sinister there it’s not a special sort above and beyond what you’d find in any given random middle/upper class American geek.
I think you get called out as a symbol of “smartypants white boys obsessed with trying to save the world from their basements” because you’ve ended up more visible than most. But, no, that doesn’t make you a cult leader, it just makes you someone who would (like many of us living in wealthy, industrialized nations) benefit from making a greater effort to understand the effects of power and privilege.
It was some kind of “neurolinguistic programming” thing. This particular incarnation of it entailed my first being yelled at until “[my] defenses were stripped away”, at which point I was supposed to accept this guy as a “master”. Later on it supposedly involved attending weird summer-camp type sessions where I was told people would undergo things that “felt like torture” but which they’d “come to appreciate”.
I didn’t go to any camp sessions and probably wouldn’t have attended them anyway for sheer lack of logistical finesse, but I am glad I had a co-worker point out to me that what was happening to me was emotional abuse at the very least.
Well I don’t know that I’ve got any “rationalist” cred, but as someone who at least attempts to approach life rationally, I am personally terrified by the prospect of being part of a cult because of the way cults seem to warp people’s capacity for thinking straightforwardly about reality. (And I could easily lump “religion” in with “cult” here in that regard).
Basically, I don’t like the way things I’d call “cultish” seem to disconnect people from concrete reality in favor of abstractions. I’ve seen some truly awful things happen as a result of that sort of mindset, and have also myself experienced an attempt at “indoctrination” into a sort of cult, and it was one of the worst experiences of my life. A person I knew and thought I could trust, and who seemed smart and reasonable enough, one day managed to trap me in an office under false pretenses and basically sat there berating me and telling me all kinds of horrible things about my character for two hours. And by the end of it, I was halfway ready to believe it, and my confidence and ability to do my schoolwork (this was in college) suffered for months afterward.
So I’m terrified of cults because I know how normal and reasonable their agents can seem at first, and how perfectly horrendous it is to find out what’s actually going on, and how difficult it can be afterward to pick up the pieces of your brain and go forward with your life. I don’t give a crap about the social-status stuff (well, beyond not wanting to be harassed, if that counts), I just don’t want anyone messing with my mind.
My recommendations (in no particular preference order):
1.) “Momo”, by Michael Ende. Like another commenter, I wish I’d read this one younger.
2.) “The Neverending Story”, also by Michael Ende. The novel (which was originally written in German, but the English translation I have seems decent enough) is far more complex and interesting than the movie, and I suspect a fair number of people on here would find the “world-building” sequences quite compelling. There’s a lot in the novel (again, which doesn’t translate through to the movie) that goes deeply into questions of what it actually means to be happy, how one might actually make others happy, and what the consequences (both positive and negative) can be of enacting wishes.
3.) The “His Dark Materials” trilogy. Yet another one I wish I’d read when younger (I actually only read these recently).
4.) “A Wrinkle in Time” (along with “A Wind In The Door” and “A Swiftly Tilting Planet”), by Madeleine L’Engle. These I did read as a youngster, and while they do occasionally invoke a certain amount of Christian imagery, it’s not nearly as heavy-handedly done as it is in, say, C.S. Lewis’ works.
“Wrinkle” was especially dear to me growing up because the main character was a socially awkward female math nerd (which is highly unusual for a book written in the 1960s).
And some of the other books, notably “A Swiftly Tilting Planet”, did actually help me in terms of being able to seriously question religious dogma, as there’s a plot-thread involving a villainous pastor and at the time I read that (around age 12) I was kind of shocked by it initially but then started realizing how a lot of evil in the real world probably stemmed from indiscriminate application of dogma.
I also liked that the books portrayed, in general, the science and math folks as being the good guys, and indicated that trying to understand reality was not a bad thing and did not make people cold or evil.
5.) “The Dark is Rising Sequence”, by Susan Cooper. Major thing absorbed from this series: sometimes even “good guys” do things that sound and appear awful. This doesn’t mean one has to blindly agree with them, but it does mean that sometimes when there are two major opposing forces, you can’t always just figure that throwing in your lot with one of them is going to preserve all your dearest values.
I agree (and I see sex/gender as far more valid of a biological concept than “race”, for the record), but I’ve noticed a correlation between people who would describe themselves in terms like “race realist” and people who think there’s good evidence for women being “less suited” to math and science than men, cognitively speaking. (And again, getting deeply into this right now is not something I’m going to do, it would be wandering off-topic for one thing.)
Responding to the question “What do you believe that most people on this site don’t?”:
I believe that people who try and sound all “edgy” and “serious” by intoning what they believe to be “blunt truths” about race/gender differences are incredibly annoying for the most part. I just want to roll my eyes when I see that kind of thing, and not because I’m a “slave to political correctness”, but because I see so many poorly defined terms being bandied about and a lot of really bad science besides.
(And I am not going to get into a big explanation right here, right now, of why I think what I think in this regard—I’m confident enough in this area here to take whatever status hit my largely-unqualified statement above brings. If I write an explanation at some point it will be on my own terms and I frankly don’t care who does or doesn’t think I’m smart in the meantime.)
Well, one thing to keep in mind is that most kids aren’t taught about Santa because their parents are trying to set up a rationalist epiphany opportunity for them. Rather, they’re taught about Santa because, well, the parents themselves were probably taught about Santa (and God, for that matter) when they were kids, and they probably just figure it’s one of those things you do when you have children.
Plus (and I think there might have been an OB post about this once), many adults find ignorance/innocence of certain types in children to be “cute” or appealing in some way. I think the appeal of the Santa mythos for some parents is that it feels to them like they are giving their child a chance, if only a brief one, to live in a world where “magic” actually exists.
In any case, I got in trouble on multiple occasions growing up for talking about how the Easter Bunny wasn’t real, how Minnie Mouse (at Disneyland) was a human in a suit, etc., in front of younger kids. That probably confused me more than anything else, more than the fact of having been told things that weren’t true to begin with—I felt like I was being pressured to perpetuate some weird group fantasy and had a terrible time figuring out what I’d supposedly done “wrong”. I mean, you can still hunt for Easter eggs and exchange presents and have fun at a character-themed park with full knowledge that actual humans (and not supernatural entities or magical anthropomorphic animals) are behind the whole thing.
All that said, I don’t know if it’s possible to extrapolate the role that being taught Santa myths and God lore as a child might play in someone’s adolescent and adult rationality. I haven’t looked to see whether a large-scale survey has even been conducted, but I bet the results of such a survey would be interesting.
In my own limited sample set (consisting of myself and various people whose upbringing I know a little bit about), there doesn’t seem to be a major correlation between the type/level of Santa mythos they were exposed to and how much they value truth, how much they appreciate/understand science, what their thoughts are on evolution (for example), etc. What seems to be a far more influential factor is whether the kid has opportunities to actually confront reality without necessarily being sheltered by privilege or convention in certain respects—e.g., the more spoiled kids I went to elementary school with seemed totally uncurious about how things physically worked, what they were made of, etc.
So while I probably wouldn’t personally tell my kids (and I don’t have or want kids, but we’re talking hypothetically here) that Santa was real—I’d feel silly and fake if I did—I don’t think parents wanting to raise little rationalists need to dwell extensively on the Santa question so much as make sure their kids actually learn about things like cause and effect and basic physics and such through experience.
Another thing to consider is the fact that being tired, or “half-asleep”, or in that twilight state one might manage to maintain when getting up to use the restroom or fetch water at night, is different from being in a state of normal waking consciousness.
Even if the skeptics attempting to spend the night in the “haunted” house don’t plan on actually sleeping, unless they’re already night-shift workers or have otherwise pre-configured their body clocks so that they’ll be awake all night, they are likely to at some point during the night start suffering impaired judgment. I suspect that the nearer a person gets physiologically to a state where their brain might initiate dreaming, the more difficult it becomes to maintain rationality.
I’ve noticed that when I get up at night after having been asleep, even in my own familiar apartment, I have a type and level of wariness that is not present during my normal waking conscious state. E.g., I find myself reluctant to stare into mirrors or peer behind the shower curtain, or look out the window, and I also find myself practically running back to bed after getting up because of an unnerving feeling that something might “get” me and that somehow being under the covers will make me “safe”.
Having done a fair bit of brainhacking in my life, I am now at the point where when I’m in this state I can recognize it as “that nighttime thing” and not take it too seriously, but it nevertheless continues to affect my behavior a little. I am curious now about whether it might be interesting to try getting up in the middle of the night and forcing myself to do all the things that make me jumpy, and whether this might be the sort of pre-exercise that might help someone stay in a “haunted” house overnight (presuming for the moment that we are not talking about a house that harbors escaped murderers or rabid squirrels).
OK, upon reading the experimental premise (I blocked out the rest of the text below that so it wouldn’t influence me) the very first idea, the idea that seemed most obvious to me, was to bet on blue every time.
I basically figured that if I had 10 cards, and 7 of them were blue, and I had to guess the color of all the cards at once (rather than being given them sequentially, which would give me the opportunity to take notes and keep track of how many of each had already appeared), then the most reliable way of achieving the most “hits” would be to predict that each card would be blue. That way I’d be guaranteed a correct answer as to the color of 7 of the 10 cards.
At the same time I’d know I’d be wrong about 3 of the cards going into the experiment, but this wouldn’t concern me if my goal was to maximize correct answers, and I was given only the information that 70% of the cards were blue while 30% were red, and that they were arranged in a random order. Short of moving outside the conditions of the experiment (and trying to, for instance, peek at the cards), there simply isn’t any path to information about what’s on them.
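The reasoning above is easy to check numerically. Here’s a small simulation (my own sketch, not from the original experiment) contrasting the “always guess blue” strategy with “probability matching”—guessing blue on a random 70% of cards—over many shuffled decks of 7 blue and 3 red cards:

```python
import random

def score(guesses, cards):
    """Count how many guesses match the actual card colors."""
    return sum(g == c for g, c in zip(guesses, cards))

def trial(rng):
    # A deck of 7 blue and 3 red cards in random order.
    cards = ["blue"] * 7 + ["red"] * 3
    rng.shuffle(cards)
    # Strategy 1: always guess the majority color.
    always_blue = ["blue"] * 10
    # Strategy 2: "probability matching" -- guess blue 70% of the time.
    matching = ["blue" if rng.random() < 0.7 else "red" for _ in range(10)]
    return score(always_blue, cards), score(matching, cards)

rng = random.Random(0)
n = 10_000
blue_total = match_total = 0
for _ in range(n):
    b, m = trial(rng)
    blue_total += b
    match_total += m

print(f"always blue: {blue_total / n:.2f} correct per 10 cards")          # exactly 7.00
print(f"probability matching: {match_total / n:.2f} correct per 10 cards")  # roughly 5.8
```

Always guessing blue hits exactly 7 of 10 every time, while probability matching averages about 5.8 (0.7 × 7 + 0.3 × 3), which is the point of the comment: with no way to learn anything about individual cards, mimicking the base rate only costs you accuracy.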
Now, if it were a matter of, “Guess the colors of all the cards exactly or we’ll shoot you”, I’d be motivated to try and find ways outside the experimental constraints—as I’m sure most people would be. It would be interesting, though, to test people’s conviction that their self-made algorithms were valid by proposing that scenario. Obviously not actually threatening people, but asking them to re-evaluate their confidence in light of the hypothetical scenario. I’d be curious to know if most people would be looking for ways to obtain more information (i.e., “cheat” per the experiment), or whether they’d stick to their theories.
You find out how to disable pieces of yourself. Then one day you find that you’ve disabled too much.
This definitely happened to me. I realized at some point (a few years ago) that in trying to force-fit myself into roles I thought I needed to fill in order to be a “responsible person”, I’d succeeded in turning off aspects of my brain that are actually pretty vital for me to learn effectively. I didn’t do anything about this realization, though, until I experienced an Epic Fail that showed me that the way I was trying to operate was neither useful nor sustainable. The big turning point for me was in moving away from seeing myself as “a broken version of normal” to “a different kind of thing entirely”.
I am not talking about becoming complacent, mind you—quite the opposite, as I’d gotten to the point where I kept running into all kinds of walls whenever I wanted to get something done, and I was very tired of having that happen.
It was as if my native operating system was different from the operating system I’d been trying to run in emulation mode for many years, to the point where things started running a lot more smoothly once I stripped away the emulation processes and got re-acquainted with my native architecture.
Increasingly, as one ages, one worries more about what one DOES, rather than about abstract characterizations of one’s capability.
This definitely happened to me. Between the ages of about 10 and 14, I was utterly obsessed with finding out what my IQ was. Somehow, somewhere along the way, I’d picked up the notion that Smartness in quantity was the most important thing a person could possibly have.
And it drove me frankly batty not knowing how much Smartness I had, because (a) I was insecure and felt like I needed to find out I had a “high enough” number in order to permit myself any sense of self-worth, and (b) I had an idea fixed in my mind that only “geniuses” with IQs of 150 or above could have any hope of addressing any of the interesting questions and topics that dominated my thoughts as a geeky little kid: faster-than-light travel, Grand Unified Theories, etc.
I spent a lot of time trying to find any papers/reports/test scores my parents might be hiding away, hoping that I’d be able to discover through doing this some idea of the quantitative value stamp I was convinced must be on my brain somewhere (though not directly viewable by me).
I didn’t actually find any of these papers until I was in my late teens, and by then I found with some surprise that I didn’t care all that much what they said. At some point between the ages of 14 and 17 I’d managed to get over my IQ obsession and move toward a different brain-related obsession (one considerably less worry-inducing): that of how brains, and in particular mine, worked at all. And in ceasing to be obsessed with quantitative test-based measurements, lo and behold, I found it far easier to actually think about things and just plain learn.
I do now know what my age-4 Wechsler score was, and it wasn’t 150. Not even close. I took another Wechsler (the adult scale) in college, and while that score ended up being quite a bit higher than my age-4 score, it was still lower than I’d originally hoped it would be. But it didn’t matter to me in the least from an emotional standpoint by then, because I’d already managed to accomplish things (like getting an A in calculus) that I’d have considered the province of people with far higher IQ scores than I actually had. Not to mention the fact that when I looked at my subtest scores, they were all over the map—I had a higher than average Block Design, but lower than average Picture Arrangement, for instance.
At this point I tend to see IQ (at least as measured on tests) as being very limited in terms of what information it actually tells you about what someone is capable of doing. E.g., I don’t think IQ scores can definitively tell you when someone is going to “hit a wall”, so to speak, in terms of what mathematical theorem they will absolutely get stuck on when they encounter it (or what engineering problem they might be able to solve, etc.).
It almost seems like some of these posts are suggesting a desire for much greater predictive ability than any test or ten-minute impression could possibly actually reveal in something as complex and feedback-sensitive as a human individual. And while I’d like as much as anyone for the world and everyone in it not to be destroyed (whether in one great cataclysm or a gradual tragic fade-out), I’ve come to terms with the fact that, as corny as it sounds, all we can do is our best, and we must do this in the utter absence of perfect knowledge regarding the limits of our individual or collective capacity.
Re. “the reason most people don’t agree with us is that they’re just not smart enough”...totally aside from the question of whether this sort of sentiment is liable to be offputting to a lot of people, I’ve very often wondered whether anyone who holds such a sentiment is at all worried about the consequences of an “Emperor’s New Clothes” effect.
What I mean by “Emperor’s New Clothes” effect is that, regardless of what a person’s actual views are on a given subject (or set of subjects), there’s really nothing stopping said person from just copying the favored vocabulary, language patterns, stated opinions, etc., of those they see as the cleverest/most prominent/most respectable members of a community they want to join and be accepted in.
E.g., in self-described “rationalist” communities, I’ve noted that lots of people involved (a) value intelligence (however they define it) highly, and (b) appear to enjoy being acknowledged as clever themselves. The easiest way to do this, of course, is to parrot others that the community of interest clearly thinks are the Smartest of the Smart. And in some situations I suspect the “parroting” can occur involuntarily, just as a result of reading a lot of the writing of someone you like, admire, or respect intellectually, even if you may not have any real, deep understanding of what you are saying.
So my question is...does anyone even care about this possibility? Or are “communities” largely in the business of collecting members and advocates who can talk the talk, regardless of what their brains are actually doing behind the scenes?