The only thing chess club members have to do to participate is to organize or play in chess matches. The only thing computer security club members have to do to participate is (usually) to help organize or play computer hacking challenges. The only thing you have to do to get involved in the Christian Community is to go to church and maybe attend a few church functions.
AFAICT, the only obvious way to participate in and contribute to rationalist culture is to write insightful posts on LessWrong, in the same way that the only way to get involved with the SCPWiki is to write SCPs. But the bar for doing that in a prosocial and truthful way is now pretty high, and was always going to produce a power law, with a few very intelligent founding members contributing most of the canon. It's not that they're doing anything wrong (I love their content); it's just what naturally happens.
Most of the problems I see on LessWrong lie downstream of this. Regular non-Google, non-finance software engineers face a dilemma: stay silent and never interact with the community, say something that's been said before, indulge one of their biases, or unfairly criticize existing works and members. For very unconscientious people, this means completely throwing away existing guardrails and deontology, because that's the only way they can think of to differentiate themselves from Eliezer and carve out a niche.
I was able to get involved in rationality by going to in-person meetups. If you're feeling left out, I suggest you do the same (or start in-person meetups yourself!).
Edit: There are also various rationalist Discord servers you could join. They're usually fun and don't require you to write a post.
Oh, don't be like that. It's more like: I like talking to rationalists. The only time I do so is when I make posts and comments. So I feel a noticeable urge to come up with mediocre posts and comments when I'd rather just have some regular community function to attend.
I’ll probably try to do a meetup soon now that I’m in LA.
I feel the same way. I like talking with people on here, but in almost every subject I have nothing substantive to contribute; I’m just a consumer.
I wish there were a broader, reddit-style aspect to this site for more ordinary posts. They don’t have to be about Kim Kardashian or anything, but just regular economics, the current bank runs, Bitcoin, lifestyle/fitness/nutrition stuff, interesting links. You know, minus the reddit toxicity and religious zealotry in every subreddit.
Maybe I’m wrong. Maybe having the majority of the sub dedicated to AI alignment really is the way to go. It’s just… I’m not smart enough, nor do I have the resources to meaningfully help on that front, and I suspect there are many like me in the IQ 130-145 range who absolutely love finally finding a community they can relate to, but don’t have the 160+ IQ to really break ground on alignment research.
Unless I’m selling us regular geniuses short, but I don’t think I am (sadly).
That’s what the r/slatestarcodex subreddit is for.
You know, you can contribute to alignment without working directly on alignment research. Focus on the places where you're shocked everyone else is dropping the ball. "Hey wait, why so little emphasis on aligning the humans who make AI? Wouldn't getting people to just slow the hell down and stop racing toward oblivion be helpful?" is one example, and working on it would use an entirely different skillset (PR, social skills, etc.). In my own case, I'm mainly interested in designing a system enabling mass human coordination and factored cognition, though I'm terrible at actually writing down the mountain of ideas in my head. This would indirectly speed up alignment by helping researchers think clearly, and would also be great in many other ways. Think outside the "AI alignment directly and nothing else" box, and find something you can work on with your skillset.
I'm glad you point this out. I think it is both real and important. However, I don't think it has to be that way! It's always been sort of a pet peeve of mine. "Normal" people can participate in so many ways. Here is what comes to mind right now; it definitely isn't exhaustive:
Contributing examples, analogies and lingo
Non-expert explanation
Asking questions (example)
Starting discussions about things
As Garrett Baker says, joining or creating in-person or online communities. There are tons!
I think the issue is that there's not enough social proof for this sort of thing; not enough other people are doing it. My theory is that so many other people writing insightful material makes it feel like the bar is set around that level, and thus it feels taboo to start "lesser" conversations.