I was able to get involved in rationality by going to in-person meetups. If you’re feeling left out, I suggest you do the same (or create in-person meetups yourself!).
Edit: There are also various rationalist Discord servers you could join. They’re usually fun, and don’t require you to make a post.
Oh, don’t be like that. It’s more like: I like talking to rationalists, and the only time I do so is when I make posts and comments. So I feel a noticeable urge to come up with mediocre posts and comments when I’d rather just have some regular community function to attend.
I’ll probably try to do a meetup soon now that I’m in LA.
I feel the same way. I like talking with people on here, but in almost every subject I have nothing substantive to contribute; I’m just a consumer.
I wish there were a broader, reddit-style aspect to this site for more ordinary posts. They don’t have to be about Kim Kardashian or anything, but just regular economics, the current bank runs, Bitcoin, lifestyle/fitness/nutrition stuff, interesting links. You know, minus the reddit toxicity and religious zealotry in every subreddit.
Maybe I’m wrong. Maybe having the majority of the sub dedicated to AI alignment really is the way to go. It’s just… I’m not smart enough, nor do I have the resources to meaningfully help on that front, and I suspect there are many like me in the IQ 130-145 range who absolutely love finally finding a community they can relate to, but don’t have the 160+ IQ to really break ground on alignment research.
Unless I’m selling us regular geniuses short, but I don’t think I am (sadly).
You know, you can contribute to alignment without contributing to alignment research directly. Focus on the places where you’re shocked everyone else is dropping the ball. One example: “Hey wait, why so little emphasis on aligning the humans that make AI? Wouldn’t getting people to just slow the hell down and stop racing toward oblivion be helpful?” Working on that would draw on an entirely different skillset (PR, social skills, etc.). In my own case, I’m mainly interested in designing a system enabling mass human coordination and factored cognition, though I’m terrible at actually writing anything about the mountain of ideas in my head. That would indirectly speed up alignment by helping researchers think clearly, and it would be great in many other ways too. Think outside the “AI alignment directly and nothing else” box, and find something you can work on with your skillset.
That’s what the r/slatestarcodex subreddit is for.