Epiphany, I’d love to see you actually take a stab at an introductory paragraph for the About page that you think would work well.
I don’t think anyone has the budget to hire a professional web marketer. Additionally, the crowd we’re “marketing” to is a bit atypical, so it’s not clear how well standard principles would transfer. Less Wrong is full of long-form essays on rationality; if you don’t like reading, you’re probably not going to like the site.
I think it would be great for more people to produce variations and suggestions for the LW newcomer pages. Even if your variation isn’t especially good overall, it might have a few good ideas we can steal for the real thing. There’s definitely high potential upside in having more person-hours thrown at this. And if enough variations are produced, maybe we can get matt to implement A/B testing and test them against each other.
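For concreteness, here is the sort of comparison A/B testing would buy us. This is only a minimal sketch, assuming we can log, for each About-page variant, how many visitors go on to register; the visitor and signup counts below are placeholders, not real numbers.

```python
# Minimal sketch of evaluating an About-page A/B test with a two-proportion
# z-test. All counts here are hypothetical placeholders.
from math import sqrt, erf

def two_proportion_z_test(signups_a, visitors_a, signups_b, visitors_b):
    """Return (difference in signup rates, two-sided p-value) for variants A and B."""
    p_a, p_b = signups_a / visitors_a, signups_b / visitors_b
    p_pool = (signups_a + signups_b) / (visitors_a + visitors_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return p_b - p_a, p_value

# Hypothetical data: 5000 visitors per variant, 150 vs. 190 signups.
diff, p = two_proportion_z_test(150, 5000, 190, 5000)
print(f"variant B lifts the signup rate by {diff:.2%} (p = {p:.3f})")
```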
I would totally love to take a stab at the About page, but I want to get more information from these guys first. I want to know what sort of people they want to attract. I have a gist of the culture here, but my gist of it is new, and I want to make sure that I’m targeting their audience. Really, we need to do that first; it’s so important. Whether we do a good job of marketing this site or not, we could one day find ourselves so overrun with new users that the old users are outnumbered. That could happen so fast that the entire place seems to change in just a few months—that’s how the exponential growth curve at another forum was experienced.
We want to do whatever we can so that if we hit a curve like that, existing members are happy with the influx and with the influence new people have on their culture. Maybe a new thread should be made: “Who do you want to meet? Choose a target audience.” Do you want to do this one, or should I?
I am asking because I don’t want to step on your toes—and there might be questions about the culture that only an older member would think to ask. Not sure.
I would like to see more diversity. Not just in terms of demographics (though that too), but in terms of fields and specialties. There is nothing inherent in rationality that should limit it to computer/ math/ physics/ philosophy types. There are highly intelligent people in other fields also, and I feel like people from other disciplines could introduce an influx of new ideas.
Also, I think recruiting new members is a positive goal, and I fully support it. I would like to see the community grow.
There is nothing inherent in rationality that should limit it to computer/ math/ physics/ philosophy types.
Actually, I’m pretty sure there is.
It really, really helps to be comfortable with math to do rationality; there is no way around it. The kind of people who have both the capability and the interest to master things like programming, probability theory, or quantum mechanics will tend to be what you call “computer/ math/ physics/ philosophy types”.
There are highly intelligent people in other fields also, and I feel like people from other disciplines could introduce an influx of new ideas.
But again, how do you know we don’t already efficiently mine smart people from other disciplines? Surely you don’t expect smart people to be evenly distributed among professions? Consider the 2011 census:
In order of frequency, we include 366 computer scientists (32.6%), 174 people in the hard sciences (16%), 80 people in finance (7.3%), 63 people in the social sciences (5.8%), 43 people involved in AI (3.9%), 39 philosophers (3.6%), 15 mathematicians (1.5%), 14 statisticians (1.3%), 15 people involved in law (1.5%) and 5 people in medicine (.5%).
Computer scientists probably are overrepresented, and there are some fields we could recruit from more. For example, I think we would benefit greatly from more economists and biologists, since more and more LWers are unfamiliar with some relevant basics in those fields; we’ve had less emphasis on those questions since the Overcoming Bias days.
We can reduce the computer science share, but I don’t think we can reduce the “computer/ math/ physics/ philosophy types” share without lowering standards, either with regard to intelligence or by reducing the focus on human rationality.
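To put a rough number on that, here is a quick tally of the census figures quoted above. It is just arithmetic on the listed counts; which categories count as “computer/ math/ physics/ philosophy types” is my own grouping, not something the census itself defines.

```python
# Tally of the professions quoted from the 2011 census above. Treating these
# six categories as "computer/ math/ physics/ philosophy types" is a judgment call.
counts = {
    "computer science": 366, "hard sciences": 174, "finance": 80,
    "social sciences": 63, "AI": 43, "philosophy": 39,
    "mathematics": 15, "statistics": 14, "law": 15, "medicine": 5,
}
techie = {"computer science", "hard sciences", "AI", "philosophy",
          "mathematics", "statistics"}

listed_total = sum(counts.values())              # 814 people listed above
techie_total = sum(counts[k] for k in techie)    # 651 of them
print(f"{techie_total}/{listed_total} = {techie_total / listed_total:.0%} "
      "of the listed respondents fall in those categories")
```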
Why do you think you need to be able to master quantum mechanics in order to “do rationality”?
I don’t. However, EY keeps emphasizing how crucial the QM sequence is to the other material, so I take his word for it.
You guessed the teacher’s password! Now, can you recite (and criticize) his reasons?
I do think probability theory and a lot of other math is a must.
Why? There is very little math in the Sequences, and almost none beyond the American grade 10 equivalent. Most is simple arithmetic and an occasional simple equation.
How clever of you to share another one! A gold star for both of us! Can you now explain why trusting a sound rationalist’s or specialist’s conclusions on their authority, when one hasn’t the time to investigate them oneself, is wrong from a Bayesian perspective?
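To sketch the answer I have in mind: an expert’s say-so is just evidence like any other, and you update on it with Bayes’ theorem. The prior and the reliability numbers below are made up purely to show the shape of the calculation.

```python
# Toy Bayesian update on expert testimony; all numbers are made-up placeholders.
def posterior(prior, p_assert_given_true, p_assert_given_false):
    """P(claim is true | expert asserts it), by Bayes' theorem."""
    numerator = p_assert_given_true * prior
    evidence = numerator + p_assert_given_false * (1 - prior)
    return numerator / evidence

# Start at 50/50; suppose the specialist asserts the claim 90% of the time
# when it is true and only 20% of the time when it is false.
print(posterior(prior=0.5, p_assert_given_true=0.9, p_assert_given_false=0.2))
# -> ~0.82: deferring to authority when you lack time to check is itself
#    a Bayesian update, not a departure from it.
```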
Now, can you recite (and criticize) his reasons?
I think it matters for his arguments about us being the pattern in our brain rather than the meat of our brain. But again, I haven’t read all of the QM sequence, and I don’t recall claiming I was a particularly good rationalist; all I claimed was that “It really, really helps to be comfortable with math to do rationality; there is no way around it.”
You don’t need to be a great rationalist to see that.
Why? There is very little math in the Sequences, and almost none beyond the American grade 10 equivalent. Most is simple arithmetic and an occasional simple equation.
Please tell me how many Americans with a grade 10 equivalent education can read and understand any of the statistics used in the papers LWers cite. How much of a gwern do they have in them? Those who can’t, and don’t read the studies cited, are taking EY’s or gwern’s or lukeprog’s conclusions on various topics as much on authority as I am taking the relevance of QM to rationality.
all I claimed was that “It really, really helps to be comfortable with math to do rationality; there is no way around it.”
It helps, I don’t disagree. Especially if you have to calculate some Bayesian thingies. But that’s an advanced level. In a hypothetical RationalU it would probably correspond to the third year.
To take a stab at that applause light.
Ceteris paribus, yes, I can agree that diverse contributors may be beneficial to our mission. Especially value diversity, since differences in desired conclusions may lead to motivated cognition being called out more often. But I think that when most people speak of diversity they don’t have that kind of diversity in mind. So, sticking to the other kinds, I have to note that I haven’t seen a single data-driven argument for why this would be so in actual humans. It is simply assumed or asserted. I’m pretty sure this is so because it happens to be a sacred value of our society. While cyberspace is clearly different from meatspace, studies done in meatspace seem to show that diversity has negative effects that are seldom talked about.
But leaving aside such undesired consequences, let me just point out that the ceteris paribus in my first paragraph is also unlikely. Very small differences can result in almost complete homogenization. Not only has it proven difficult, expensive, and perhaps mostly ineffective to counteract this in meatspace; you would also need to perform a delicate balancing act to keep a certain heterogeneous mix together if your goal isn’t simply to swap one group for another. I think we already spend insufficient attention on gardening, and this would be just one more difficult task to add to the list.
Once more I ask: why does no one ever subject such proposals to an explicit cost-benefit analysis?
Hmm. That’s an interesting idea—and you got a lot of points for that. This is a more complicated task than simply ensuring that users who are interested in developing their thinking abilities get a chance to see how awesome LessWrong is when they come to the front page. I will think on it.
I want to know what sort of people they want to attract.
My answers: smart people, people who already have a strong technical background in something, people who already have a strong interest in rationality.
Whether we do a good job of marketing this site or not, we could one day find ourselves so overrun with new users that the old users are outnumbered. That could happen so fast that the entire place seems to change in just a few months—that’s how the exponential growth curve at another forum was experienced.
This sounds like a good problem to have to me. If we later decide we don’t like it, it should be easy to just change the about page back to the way it was before.
There seem to be a lot of lurkers on LW, so a surge of new lurkers might be more likely than a surge of new contributors, with lurkers gradually moving to contributor status.
Maybe a new thread should be made: “Who do you want to meet? Choose a target audience.” Do you want to do this one, or should I?
IMO it isn’t worth a new post, for the time being at least.
I am asking because I don’t want to step on your toes—and there might be questions about the culture that only an older member would think to ask. Not sure.
I think the principle of “be bold” applies to this sort of collaborative writing process. In other words, throw something at us and we’ll give you feedback on it.
I second daenerys and John’s user requirements, but will add a few more restrictive ones of my own. I want people who are good at taking and giving constructive criticism. I also want people who are willing to read a fairly large backlog of posts (at least a decent chunk of the Sequences), and to read more if people give them links indicating that we’ve already covered whatever it is they are talking about.