I’m a very confused person trying to become less confused. My history as a New Age mystic still colors everything I think even though I’m striving for rationality nowadays. Here’s my backstory if you’re interested.
MSRayne
Well, as I attempted to express in the original comment, I am not a Shaiva; rather, I had mystical experiences as a teen that led me to invent my own religion from scratch, which has similarities with various other belief systems, Kashmir Shaivism among them. For the most part, however, it’s just a kind of background element of my existence, part of my ontology, and not something I put much attention toward actively anymore. In practice I’m effectively an atheist physicalist like everyone else here. It’s just… there’s also something that lurks beneath. Or there used to be. I’ve gotten more disillusioned, more empty-souled and this-worldly as I’ve gotten older, and I don’t really know how to get back the way I used to feel. Psychedelics are probably the only way; meditation doesn’t do anything for me.
I think you’re wrong that Level 4 is rare. It describes everyday reality in the upper strata of any cult—the psychopathic leadership class who make up shit for everyone else to believe (or to claim to believe for status, and so on), and who compete for status among those followers. And there are a LOT of cults in modern society, including organizations not traditionally perceived as cults, such as political ideologies.
I’m sure you’ve already thought of this, and I know nothing about this area of biology, but isn’t it possible that the genes coding for intelligence more accurately code for the developmental trajectory of the brain, and that changing them would not in fact affect an adult brain?
I think people have “criticized” Minecraft for it being unclear what the point is, and for being more of a toy or sandbox than a “game.”
Myself included. I can’t play Minecraft; it’s far too open-ended, and makes me feel anxious and overwhelmed. “Wtf am I supposed to do??” I want a game to give me two or three choices max at every decision point, not an infinite vector space of possibilities through which I cannot sort or prioritize.
This post though is about one of my big obsessions: trying to figure out how to design a game (computer or tabletop or both) which makes cooperation fun. And I mean, fun in the way Diablo is fun. Addictive, power fantasy feeling, endless sequence of dopamine hits, sexy. The problem is that the only way to produce that Diablo-flow is to enable people to act automatically, reacting to signs and triggers with preprogrammed responses so that they can sink down into the animalistic part of their brain that hunts and stalks and pounces on things to tear them apart without thought or simulation.
But learning to negotiate with others is the exact opposite of that, and is the main reason we have the effort-intensive simulation system to begin with—so the problem of making a game that is simultaneously compelling on a primal level, and centers on conflict resolution rather than just conflict… is hard.
The only thing that seems to me to have the same kind of flow is dance and other group rituals (such as those in religions that haven’t yet ossified into mere passionless false beliefs), which don’t really help with the whole “training negotiation” thing (though they do induce people to align with one another on an emotional level), and which also cannot easily be turned into video games or TTRPGs.
I never actually said that all these notions are constructed and fake, only that some are. Clearly some aren’t. There are false positives and false negatives. I feel as if you’re arguing against a straw man here.
If I were Bob I’d have told her to fuck off long ago and stopped letting some random person berate me for being lazy just like my parents always have. This is basically guilt-tripping, not a beneficial way of approaching any kind of motivation, and it is absolutely guaranteed to produce pushback. But then, I’m probably not your target audience, am I?
Btw just to be clear, I think Said Achmiz explained my reaction better than I, who habitually post short reddit-tier responses, can. My specific issue is that Alice seems to be acting as if it’s any of her business what Bob does. It is not. Absolutely nobody likes being told they’re not being ethical enough. It’s why everyone hates vegans. As someone who doesn’t like experiencing such judgmental demands, I would have the kneejerk emotional reaction to want to become less of an EA just to spite her. (I would not of course act on this reaction, but I would start finding EA things to be in an ugh field because they remind me of the distress caused by this interaction.)
Holy heck I have been enlightened. And by contemplating nothingness too! Thanks for the clarification, it all makes sense now.
I really enjoy this sequence, but there’s a sticking point that’s making me unable to continue until I figure it out. It seems to me rather obvious that… utility functions are not shift-invariant! If I denominate option A at 1 utilon and option B at 2 utilons, that means I am indifferent between a certain outcome of A and a 50% chance of B (with the other 50% being nothing, at 0 utilons)—and this is no longer the case if I shift my utility function even slightly. Ratios of utilities mean something concrete and are destroyed by translation. Since your entire argument seems to rest on that inexplicably not being the case, I can’t see how any of this is useful.
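The arithmetic behind this objection can be sketched in a few lines. This is only an illustration of the comment’s own calculation, under one assumption I am supplying: the 50%-of-B lottery’s other branch is a “nothing” outcome worth 0 utilons, and the shift is applied to the named options A and B but not to that implicit zero point.

```python
# Sketch of the comment's arithmetic (assumption: the lottery's other
# branch is a "nothing" outcome fixed at 0 utilons, and the shift is
# applied only to the named options A and B).

def expected_utility(lottery):
    """Expected utility of a lottery given as (probability, utilons) pairs."""
    return sum(p * u for p, u in lottery)

u_A, u_B = 1.0, 2.0

# Before the shift: certain A vs. a 50/50 lottery between B and nothing.
eu_certain_A = expected_utility([(1.0, u_A)])            # 1.0
eu_lottery_B = expected_utility([(0.5, u_B), (0.5, 0.0)])  # 1.0 -> indifferent

# After shifting A and B by +10 (leaving "nothing" at 0):
c = 10.0
eu_certain_A_shifted = expected_utility([(1.0, u_A + c)])              # 11.0
eu_lottery_B_shifted = expected_utility([(0.5, u_B + c), (0.5, 0.0)])  # 6.0

# The indifference is destroyed: the 1:2 ratio between A and B is not
# preserved by translation when the zero point stays put.
```

Note that the whole disagreement turns on that fixed zero point: if the shift were applied to every outcome, including the implicit “nothing,” both sides of the comparison would move by the same amount and the indifference would survive.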
I understand all this logically, but my emotional brain asks, “Yeah, but why should I care about any of that? I want what I want. I don’t want to grow, or improve myself, or learn new perspectives, or bring others joy. I want to feel good all the time with minimal effort.”
When wireheading—real wireheading, not the creepy electrode in the brain sort that few people would actually accept—is presented to you, it is very hard to reject it, particularly if you have a background of trauma or neurodivergence that makes coping with “real life” difficult to begin with, which is why so many people with brains like mine end up as addicts. Actually, by some standards, I am an addict, just not of any physical substance.
And to be honest, as a risk-averse person, it’s hard for me to rationally argue for why I ought to interact with other people when AIs are better, except the people I already know, trust, and care about. Like, where exactly is my duty to “grow” (from other people’s perspective, by other people’s definitions, because they tell me I ought to do it) supposed to be coming from? The only thing that motivates me, sometimes, to try to do growth-and-self-improvement things is guilt. And I’m actually a pretty hard person to guilt into doing things.
That’s a temporary problem. Robot bodies will eventually be good enough. And I’ve been a virgin for nearly 26 years, I can wait a decade or two longer till there’s something worth downloading an AI companion into if need be.
Neither of these really describes what childhood is for. Both are inventions of modern WEIRD society. I’d suggest you read “The Anthropology of Childhood: Cherubs, Chattel, Changelings” for a wider view on the subject… it’s pretty bleak, though. The very idea that there is such a thing as an optimal childhood that parents ought to strive to provide for their children is also a modern, Western, extremely unusual idea; throughout most of history, in most cultures, children were just… little creatures that would eventually be adults, and till then either got in the way or were used for something.
The norm appears to be “benevolent neglect”, at best—that is, outside of our Western bubble of reality (and East Asia, which independently invented some of the same norms), children are not actively taught or guided toward anything; mostly they are ignored, and they teach themselves everything they need to know by mimicking adults. People spend time with their children, but it’s rarely a goal explicitly striven for (the way it is for Western parents); it’s just a side effect of their existing at all.
To be honest, I look forward to AI partners. I have a hard time seeing the point of striving to have a “real” relationship with another person, given that no two people are really perfectly compatible, no one can give enough of their time and attention to really satisfy a neverending desire for connection, etc. I expect AIs to soon enough be better romantic companions—better companions in all ways—than humans are. Why shouldn’t I prefer them?
Great, apparently I’m in just the right place… I’m always alone and have few friends who might influence me to give up my wacky ideas! Wonderful…
crickets
Those stories are surprisingly coherent and compelling. They were actually fun to read!
I’m not sure how useful the concept of boundary placement rebellion is, though. It certainly is a thing, but it’s also something basically everyone engages in. I pretty much constantly do it… though maybe that says more about me than anything...
“Thou strivest ever. Even in thy yielding, thou strivest to yield; and lo! thou yieldest not. Go thou into the outermost places, and subdue all things. Subdue thy fear and thy distrust. And then—YIELD.”—Aleister Crowley
I’m never really sure what there’s any point in saying. My main interests have nothing to do with AI alignment, which seems to be the primary thing people talk about here. And a lot of my thoughts require the already existing context of my previous thoughts. Honestly, it’s difficult for me to communicate what’s going on in my head to anyone.
No, it’s called “lying”. The text that he produces as a result of these social pressures does not reflect his actual thought processes. You can’t judge a belief on the basis of a bunch of ex post facto arguments people make up to rationalize it—the method by which they came to hold the belief is much more informative. And for those of us with very roundabout styles of thinking (such as myself), being forced into this self-censorship, this modification of our thought patterns into something “coherent” and easy to read, actually destroys all the evidence of how we came to the idea, and thus much of your ability to effectively examine its validity!
I feel the same as Adrian and Cato. I am very much the opposite of a rigorous thinker—in fact, I am probably not capable of rigor—and I would like to be the person who spews loads of interesting off-the-wall ideas for others to sift through and expand upon where useful. But that kind of role doesn’t seem to exist here, and I feel very intimidated even writing comments, much less actual posts—which is why I rarely do. The feeling that I have to put tremendous labor into making a Proper Essay full of citations and links to sequences and detailed arguments and so on—it’s just too much work, and not worth the effort for something I don’t even know anyone will care about.
This makes me wonder if some proportion of “masculine” gay men are actually transwomen (of the early onset type) with autoandrophilia. I may even fit into that category myself. I didn’t care about masculinity and in fact found it somewhat abhorrent and not-me-ish until I started getting off to more masculine looking guys in porn. (When I first saw porn when I was 12 I mainly focused on twinks and wanted to look like them, and there’s still a part of me that feels that way, which wars with the part that wants to bulk up because masc dudes are also hot—and usually wins, because bulking is hard and I would rather read books.)
Of course, my natural femininity is not tremendous (I wasn’t flamboyant as a child and as far as I know never have been—I’ve always thought feminine-acting men were creepy—but I did flirt with identifying as nonbinary during my late teens, and used to have multiple female alters during the period where I thought I had multiple personalities), and most of my femininity is the result of misandry taught by the media and my mother (I believed for most of my childhood and early teens that masculinity is disgusting and bestial, and that only women can be powerful / noble, but later realized that like all other disgusting and bestial things, masculinity is sexy as fuck, which helped me get out of my misandry phase.)
Nowadays I think my gender identity is probably something like “true hermaphrodite / omega (as in the omegaverse fanfiction trope) male”, which unfortunately is not something that one can currently medically transition to, and I experience no dysphoria (and to be honest, the only reason I think it would be cool to have both male and female genitals is because it seems too asymmetric and unbalanced not to, and I’m very Libra [yes I know astrology isn’t real, but it’s still a helpful and / or fun language to describe personalities with]).
Well—actually, it’s possible I do experience dysphoria, but in which direction changes with my mood (I sometimes don’t feel masculine enough), and there’s an element of The Paraphilia Which Must Not Be Named [note: if you ask me, I will not name it, and I will neither confirm nor deny guesses, but you can probably figure it out based on what I’m not saying] which also interacts in weird ways with the whole thing, and overall I just find gender and sexuality stuff tiresome and confusing and sort of wish I didn’t have to deal with it.
Thanks for coming to my rambly asf TED talk.
I am not the best at writing thorough comments because I am more of a Redditor than a LessWronger, but I just want you to know that I read the entire post over the course of ~2.5 hours and I support you wholeheartedly and think you’re doing something very important. I’ve never been part of the rationalist “community” and don’t want to be (I am not a rationalist, I am a person who strives weakly for rationality, among many other strivings), particularly after reading all this, but I definitely expected better out of it than I’ve seen lately. But perhaps I shouldn’t; the few self-identified rationalists I’ve interacted with one on one have mostly seemed like… at best, very strange people to me. And Eliezer has always, honestly, struck me as a dangerous narcissist whose interest in truth is secondary to his interest in being the Glorious Hero. I don’t want to go to the effort of replying to specific things you said—and you don’t know who I am and probably won’t read this anyway—but yeah, just, I’m glad you said them.