Right, alignment advocates really underestimate the degree to which talking about sci-fi sounding tech is a sticking point for people
sullyj3
Is there any relation to this paper from 1988?
https://www.semanticscholar.org/paper/Self-Organizing-Neural-Networks-for-the-Problem-Tenorio-Lee/fb0e7ef91ccb6242a8f70214d18668b34ef40dfd
I think it’s reasonable to take the position that there’s no violation of consent, but it’s unreasonable to then socially censure someone for participating in the wrong way.
your initial comment entirely failed to convey it
Sure, I don’t disagree.
This is just such a bizarre tack to take. You can go down the “toughen up” route if you want to, but it’s then not looking good for the people who have strong emotional reactions to people not playing along with their little game. I’m really not sure what point you’re trying to make here. It seems like this is a fully general argument for treating people however the hell you want. After all, it’s not worse than the vagaries of life, right? Is this really the argument you’re going with, that if something is a good simulation of life, we should just unilaterally inflict it on people?
I want to be clear that it’s not having rituals and taking them seriously that I object to. It’s sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you’ve assigned them. They didn’t ask for this.
In my opinion Chris Leong showed incredible patience in writing a thoughtful post in the face of people being upset at him for doing the wrong thing in a game he didn’t ask to be involved in. If I’d been in his position I would have told the people who were upset at me that this was their own problem and they could quite frankly fuck off.
Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.
You’re right, I haven’t been active in a long time. I’m mostly a lurker on this site. That’s always been partly the case, but as I mentioned, it was the last of my willingness to identify as a LWer that was burnt, not the entire thing. I was already hanging by a thread.
My last comment was a while ago, but my first comment is from 8 years ago. I’ve been a Lesswronger for a long time. HPMOR and the sequences were pretty profound influences on my development. I bought the sequence compendiums. I still go to local LW meetups regularly, because I have a lot of friends there.
So, you can dismiss me as some random who has just come here to hate if you want, I guess, but I don’t think that makes much sense. Admittedly, the fact that I was a bit obnoxious with my criticism probably makes that tempting. You can tell I’m here in bad faith from all the downvotes, right?
I think the audience seeing this comment is heavily self-selected to care about the Petrov Day celebration and to think it’s good and important. The core LWers present risk severely underestimating how off-putting this stuff is. How many people would be interested in participating in this community, constructively, if the vibes were a little less weird? Those people, unlike me, mostly don’t care enough to rock up and criticize.
The reason I was rude is that I’m frustrated at feeling like I have to abandon my identification as an LW rat, because I just don’t want to be associated with it anymore. I got so much value from LessWrong, and it feels so unnecessary.
For what it’s worth, this game and the past reactions to losing it have burnt the last of my willingness to identify as a LW rationalist. Calling a website going down for a bit “destruction of real value” is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I’m sorry, but it’s not. Go outside or something. It will make you feel good, I promise.
Then getting upset at other people when they don’t take a strange ritual as seriously as you do? As you’ve decided to, seemingly arbitrarily? When you’ve deliberately given them the means to upset you? It’s tantamount to emotional blackmail. It’s just obnoxious and strange behaviour.
As a trust building exercise, this reduces my confidence in the average lesswronger’s ability to have perspective about how important things are, and to be responsible for their own emotional wellbeing.
This feels elitist, übermensch-y, and a little masturbatory. I can’t really tell what point, if any, you’re trying to make. I don’t disagree that many of the traits you list are admirable, but noticing that isn’t particularly novel or insightful. Your conceptual framework seems like little more than thinly veiled justification for finding reasons to look down on others. Calling people more or less “human” fairly viscerally evokes past justifications for subjugating races and treating them as property.
We’re supposed to learn agency from Fight Club? That frankly seems like terrible advice.
The truth of probability theory itself depends on non-contradiction, so I don’t think probability is a valid framework for reasoning about the truth of fundamental logic: if logic is suspect, probability itself becomes suspect.
Kudos to Andreas Giger for noticing what most of the commenters seemed to miss: “How can utility be maximised when there is no maximum utility? The answer of course is that it can’t.” This comes incredibly close to stating that perfect rationality doesn’t exist, but that was only implied, never explicitly stated.
I think the key is infinite vs. finite universes. Any conceivable finite universe can be arranged in a finite number of states, one, or perhaps several, of which could be assigned maximum utility. You can’t do this in universes involving infinity. So if you want perfect rationality, you need to reduce your infinite universe to just the stuff you care about. This is doable in some universes, but not in the ones you posit.
In our universe, we can shave off the infinity, since we presumably only care about our light cone.
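The finite-vs-infinite point above can be made concrete with a small sketch (the state names and utility values here are hypothetical, purely for illustration): over finitely many states a maximizer always exists, while an unbounded sequence of utilities can approach a supremum that no state actually attains.

```python
# Hypothetical example: utility maximization over a finite state space
# always succeeds, but can fail over an infinite one.
finite_states = ["A", "B", "C"]
utility = {"A": 1.0, "B": 3.0, "C": 2.0}

# With finitely many states, max() is guaranteed to return a maximizer.
best = max(finite_states, key=lambda s: utility[s])
assert best == "B"

# With infinitely many states, e.g. utility(n) = 1 - 1/n for n = 1, 2, 3, ...,
# utilities approach 1 but no state attains it: there is no maximum to pick.
def utility_n(n):
    return 1 - 1 / n

# Any candidate n is beaten by n + 1, so no choice is "perfectly rational".
assert utility_n(10) < utility_n(11) < 1
```

Restricting attention to a finite subset of states (the “stuff you care about”) is exactly what makes the first case, rather than the second, apply.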
Unfortunately the only opinions you’re gonna get on what should be instituted as a norm are subjective ones. So… Take the average? What if not everyone thinks that’s a good idea? Etc, etc, it’s basically the same problem as all of ethics.
Drawing that distinction between normative and subjective offensiveness still seems useful.
Just encountered an interesting one:
Eradication of the Parasitoid Wasp is genocide!
Perhaps a solution could be to create stronger social ties; video chat? Could be good for asking each other for help and maybe progress reports for accountability and positive reinforcement.
As an interested denizen of 2015, it might be cool to make this a regular (say, monthly?) thread, with a tag for the archive.
Oh, like Achilles and the tortoise. Thanks, this comment clarified things a bit.
Doesn’t this add “the axioms of probability theory”, i.e. “logic works”, i.e. “the universe runs on math”, to our list of articles of faith?
Edit: After further reading, it seems like this is entailed by the “large ordinal” thing. I googled well-orderedness, encountered the Wikipedia article, and promptly shat a brick.
What sequence of maths do I need to study to get from Calculus I to set theory and to understanding what the hell well-orderedness means?
I feel like it would’ve been even better if no one ended up explaining to Capla.
Fair point, and one worth making in the course of talking about sci-fi-sounding things! I’m not asking anyone to represent their beliefs dishonestly, but rather to introduce them gently. I’m not an expert, but I’m not convinced of the viability of nanotech, so if it’s sufficient rather than necessary to the argument, it seems prudent to stick to more clearly plausible pathways to takeover as demonstrations of sufficiency, while still maintaining that weirder-sounding stuff is something one ought to expect when dealing with something much smarter than you.