If it were me, I’d split your list after reductionism into a separate ebook. Everything that’s controversial or hackles-raising is in the later sequences. A (shorter) book consisting solely of the sequences on cognitive biases, rationalism, and reductionism would be much more the kind of thing somebody without prior rationalist leanings can pick up and take something valuable away from. The later sequences have their merits, but they are absolutely counterproductive to raising the sanity waterline in this case. They’ll get your book labeled kooky and weird, and they don’t, in themselves, improve their readers enough to justify the expense. People interested in the other stuff can get the companion volume.
You could label the pared-down volume something self-helpy like “Thinking Better: The Righter, Smarter You.” For goodness’ sake, don’t put the word ‘sequences’ in the title. That doesn’t mean anything to anyone not already from LW, and it won’t help people figure out what the book is about.
EDIT: Other title suggestions—really just throwing stuff at the wall here
Rationality: Art and Practice
The Rational You
The Art of Human Rationality
Black Belt Bayesian: Building a Better Brain
The Science of Winning: Human Rationality and You
Science of Winning: The Art and Practice of Human Rationality (I quite like this one)
Oh, and somebody get Yudkowsky an editor. I love the sequences, but they aren’t exactly short and to the point. Frankly, they ramble. Which is fine if you’re just trying to get your thoughts out there, but people don’t finish the majority of the books they pick up. You need something that’s going to be snappy, interesting, and cater to a more typical attention span. Something maybe half the length we’re looking at now. The more of it they get through, the more good you’re doing.
EDIT: Oh! And the whole thing needs a full jargon palette-swap. There’s a lot of LW-specific jargon that isn’t helpful. In many cases, there’s existing academic jargon that can take the place of the phrases Yudkowsky uses. Aside from lending the whole thing a superficial-but-useful veneer of credibility, it’ll make the academics happy, and make them less likely to make snide comments about your book in public fora. If you guys aren’t already planning a print-on-demand run, you really should. Ebooks are wonderful, but the bulk of the population is still humping dead trees around. An audiobook or podcast might be useful as well.
For the most part, the sequences define said jargon rather than using it.
First: Ahahahaha. No. The sequences use jargon. They use a lot of it. Approximately every single hyperlink in the Sequences is there to define jargon. This, by the way, works wonderfully when you have hyperlinks to remind you what the heck all the jargon means. Which leads me to #2...
Second: Print books, lacking hyperlinks, will need a glossary
Third: You are not allowed to use the chapter index as said glossary
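(If anyone actually builds that glossary, the hyperlinks are a decent starting point, since most of them exist to define jargon. A rough sketch of scraping candidate terms from a post’s HTML; the snippet and hrefs below are made up, and the real posts would need to be fetched from somewhere.)

```python
# Sketch: collect candidate glossary terms from the hyperlinks in a post's HTML.
# The HTML snippet and hrefs are hypothetical examples, not real LW URLs.
from html.parser import HTMLParser

class JargonLinkCollector(HTMLParser):
    """Collects (anchor text -> href) pairs from a post's HTML.
    In the Sequences, in-text links mostly define jargon, so the
    anchor texts approximate a glossary term list."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = None
        self.text_parts = []
        self.terms = {}  # anchor text -> href

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href")
            self.text_parts = []

    def handle_data(self, data):
        if self.in_link:
            self.text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            term = "".join(self.text_parts).strip()
            if term:
                self.terms[term] = self.href
            self.in_link = False

collector = JargonLinkCollector()
collector.feed('<p>See <a href="/lw/lum">luminosity</a> and '
               '<a href="/lw/map">the map is not the territory</a>.</p>')
for term in sorted(collector.terms):
    print(term, "->", collector.terms[term])
```

A human would still have to prune the list (not every link is jargon) and write the actual definitions, but it beats building the glossary from memory.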
I think what this person is saying is “We already have words for a lot of these things so Eliezer and the other people who wrote the sequences are making up their own words for no reason.”
The sequences would get more hits from search engines, LWers and other flavors of rationalists would be able to communicate more easily, and people would have to remember fewer terms overall if he just used existing terms.
(Edit: added “and other people” to above sentence about making up unnecessary words for the sequences)
Example: Luminosity is a new word for the existing concept “self-awareness”
(Edit: “self-awareness” used to say “metacognition”.)
A new annoying word. I’d nominate it for “worst local jargon (single-word division)”.
Does any other unnecessary jargon come to mind?
“Luminosity” is best glossed as “self-awareness”, not “metacognition”. Also, Eliezer didn’t make it up (and I didn’t make it up from scratch), so bringing it up under the grandparent is peculiar.
Since metacognition is thinking about your thoughts, and self-awareness means introspecting, which means thinking about your thoughts (or self-awareness can mean being aware that you have a separate personality, a self, which isn’t how you use it), I would say these could be synonymous. To test whether my perception here is wrong, I went to Wikipedia, Google, and the dictionary to see how they used those words. Here’s what I discovered:
Both the internet and you are using self-awareness in a broader way, to encompass emotions and states of mind. Metacognition is narrower, and is something I personally use more frequently when referring to re-engineering my thought processes. I think the reason I initially suggested “metacognition” is that the sequences are, to me, an invitation to think about thinking and to re-engineer one’s thinking processes, so I was interpreting your luminosity concept within that context. Also, when I introspect about my feelings and experiences, I’m pretty focused on getting down to the thoughts behind everything and re-engineering them. The way I experience self-awareness, it and metacognition are inseparable and may as well be synonymous, but I acknowledge that other people may do it differently.
If your focus is more on the emotional / state-of-mind aspects (I have not read all of your luminosity articles, so I can’t see the patterns in how you use it), self-awareness is probably the closer synonym for luminosity.
I also discovered that you have one of the top five Google results for luminosity. Good job. However, I think you’re more likely to show up when people are looking for light bulbs or similar than when they are looking for self-improvement materials, so in my view it was not the optimal term for SEO.
I took the word “luminosity” from “Knowledge and its Limits” by Timothy Williamson, although I’m using it in a different sense than he did.
I see that, but I did not say that Eliezer came up with that specific example.
Because luminosity is a term used in the sequences, it’s relevant, and because it’s an example of a word that’s synonymous with a much more common word: “Luminosity, as I’ll use the term, is self-awareness.”
It is definitely relevant as an example of unnecessary jargon in the sequences.
It would not support the point that Eliezer is coming up with unnecessary jargon, but that’s not my point; my real point is that the sequences contain unnecessary jargon. That is what’s important here.
Since the sequences would get more hits from search engines by using existing terms, LWers and other flavors of rationalists would be able to communicate more easily, and people would have to remember less terms overall if existing terms were used, do you think it would be of greater benefit to use “self-awareness” instead?
Note: For SEO reasons it may be safer to add the term self-awareness to the title rather than replace luminosity. (When you change your title entirely, it may look like search engine spamming and get you spam penalties. I will check whether this specific technique worked if you’re interested.)
One could also say that ‘meta-cognition’ allows one to arrive at self-awareness.
I dunno, I think it might go the other way around, actually. As tempting as it is to continue this, I’m becoming aware that this has resulted in a bunch of people talking about wording, and I’m not sure there’s a point in discussing it further. I do want to discuss my ideas for slowing Moore’s Law, though.
Merely modifying the title may also look like keyword stuffing.
I work with an SEO professionally. I was advised to do this by an SEO on another website. If she (or anyone doing term replacements on the sequences) is interested, I will do the extra step of checking how well this tweak worked on the other site.
(Edited for bad phrasing due to down votes.)
I’m not sure whether what looks like rambling is actually an effective method of easing people into the ideas so that the ideas are easier to accept, rather than just being inefficient.
Is there any way to find out?
In general, when something can be either tremendously clever, or a bit foolish, the prior tends to the latter. Even with someone who’s generally a pretty smart cookie. You could run the experiment, but I’m willing to bet on the outcome now.
It’s important to remember that it isn’t particularly useful for this book to be The Sequences. The Sequences are The Sequences, and the book can direct people to them. What would be more useful would be a condensed, rapid introduction to the field that tries to maximize insight-per-byte. Not something that’s a definitive work on rationality, but something that people can crank through in a day or two, rave about to their friends, and come away with a better idea of what rational thinking looks like. It’d also serve as a less formidable introduction for those who are very interested, to the broader pool of work on the subject, including the Sequences. Dollar for sanity-waterline dollar, that’s a very heavily leveraged position.
Actually, if CFAR isn’t going to write that book, I will.
I’m currently writing a summary of each sequence as I read it. I am doing this because it helps me remember what I read. The result will be a Cliff’s Notes version of the sequences.
If you were going to do something similar anyway, I might as well just post these notes when I am done to save you the work. Would that serve the purpose you were thinking of? Or is your idea significantly different?
I’m not sure if we will need these, though you should definitely put your summaries in the LW wiki!
Hmm okay. Maybe I will do just that. (:
Suggested title: The Tao of Bayes
Ideally it should not be significantly longer than “The Tao of Pooh”
I’d be half-tempted to try my hand at it myself...
So far, I’m twenty pages in, and getting close to being done with the basic epistemology stuff.
Regarding the jargon, I agree with wedrifid that LW-specific jargon is actually defined in the sequences, and from what I’ve heard and experienced this is extremely helpful in setting down a common language for us to discuss these matters.
However, there is some jargon that could and probably should be done away with: the computer science stuff. Not all sequences/articles have it, but when it’s there it’s usually several levels of inference away from laypeople. The CS/programming examples, comparisons and metaphors are fun for someone like me, but it’s an accepted matter among IT people that things like the XKCD comic on a random function that always returns 4 will not help get the point across to non-IT people.
I’m sure this has been mentioned before, but it’s worth making sure it gets looked over, and while doing so, remember that when writing educational material, most people severely overshoot the level they’re aiming for, and end up writing a text that’s perfect for undergrads when they were targeting a middle-school audience or some such.
Personally, I’d leave in most of the random intercultural references (like the anime references, for instance) since I suspect they’d still reach a good portion of the audience and wouldn’t have negative impact, but that’d be up for discussion. This also gives me an idea, but I’ll make a separate comment for it.
On the one hand, I agree with you. If people can’t understand, then that’s bad. On the other hand, touches like those give the sequences personality, and that personality may be part of what makes them popular.
Usually, though, there’s a way to phrase IT descriptions such that everyone can understand. I do this for my boss all the time. Maybe giving it a high-tech “personality” and making it comprehensible are not mutually exclusive.
Yes, I agree.
The IT culture stuff is good; what I think wouldn’t pass is specific IT vocabulary or concepts that don’t get introduced within the sequences and that are assumed to be understood.
I seem to recall a reference somewhere to object-level vs. class-level distinctions, and someone who’s never heard of OOP would have no idea that we’re basically talking about the programming equivalent of specific emails vs. email templates (or “an email”, or whatever helps them understand, but I’ve found the specific-email-vs-template example sufficient as a first step for most people I’ve had to explain this to).
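To make the analogy concrete, here’s a toy sketch (nothing from the sequences, just the email example): the class is the template, and each instance is a specific email stamped out from it.

```python
# The email-template analogy, made concrete: a class is the template,
# each instance is one specific email produced from it.
class EmailTemplate:
    def __init__(self, subject, body):
        self.subject = subject  # every email has *a* subject (class-level fact)
        self.body = body

greeting = EmailTemplate("Welcome!", "Hello Alice")   # one specific email
reminder = EmailTemplate("Reminder", "Meeting at 3")  # another specific email

# Class-level statement: "emails have subjects."
assert hasattr(greeting, "subject") and hasattr(reminder, "subject")
# Object-level statement: "*this* email's subject is 'Welcome!'."
assert greeting.subject == "Welcome!"
```

The class-level claim is true of every email the template can produce; the object-level claim is only about one particular email, which is the distinction the jargon is gesturing at.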
Yup, this is planned. It may be that SI publishes the full Sequences thing, and CFAR publishes the cut-down version (with a new introduction by Eliezer, or something).
Would you please contact malo@intelligence.org? Thanks!
That sounds like exactly the correct split.
That one is taken.
My mistake.
I agree completely with this comment (assuming that the ebook is aimed at people who aren’t already familiar with LW.) Regarding the title, you should aim for describing what the writing is about, not where the writing came from. Unfortunately I can personally only generate boring titles like “Essays On Thinking Straight”.
Quantum mechanics and metaethics are what initially drew me to LessWrong. Without them, the Sequences aren’t as impressive, interesting, and downright bold. As solid as the other content is, I don’t think the Sequences would be as good without these somewhat more speculative parts. This content might even be what really gets people talking about the book.
Maybe we could test that. Does LessWrong keep non-anonymous access logs? If so, we may be able to (approximately) reconstruct access patterns over the weeks/months/years by unique user. We could learn:
What are the first reads of newcomers?
What are typical orders of reading?
Does reading stop, and if so, when and where?
For instance, if we find that people who start with the quantum mechanics sequence tend to leave more often than the others, then it is probably a good idea to segregate it in a separate volume. It would at least signal that the author knows this is advanced or controversial material.
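If someone does get the logs, the reconstruction itself is simple. Here’s a toy sketch assuming hypothetical (user, timestamp, page) rows; the real log format would almost certainly differ:

```python
# Sketch: reconstruct per-user reading patterns from (user, timestamp, page)
# rows. The log below is entirely made up for illustration.
from collections import defaultdict

log = [
    ("u1", 100, "map-territory"), ("u1", 160, "quantum-sequence"),
    ("u2", 105, "quantum-sequence"),                      # u2 read one page and left
    ("u3", 110, "map-territory"), ("u3", 150, "words"),
]

by_user = defaultdict(list)
for user, ts, page in log:
    by_user[user].append((ts, page))

first_reads = defaultdict(int)   # which page each user started on
last_reads = defaultdict(int)    # which page they stopped on
for user, visits in by_user.items():
    visits.sort()                # order each user's visits by timestamp
    first_reads[visits[0][1]] += 1
    last_reads[visits[-1][1]] += 1

print(dict(first_reads))  # first reads of newcomers
print(dict(last_reads))   # where reading stopped
```

A page that dominates `last_reads` without dominating `first_reads` would be the kind of drop-off point the proposal is after.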
From Google Analytics:
Google has a term for pages that people come in on: “landing pages”. Basically, it can tell whether someone got to the page from clicking an external link / advertisement or by using a search engine—or whether they clicked a link from within the same site.
“Timeless Physics” is in the 50 most popular landing pages and so is “An intuitive explanation of quantum mechanics” (though it is not a sequence). I am not seeing any pattern to the topics that people prefer in these landing pages. I can tell you this though, all the top 50 landing pages have terrible bounce rates (meaning people leave the site without clicking further), usually 80% or 90%.
“What are typical orders of reading?”
Analytics has something like this, but it’s not specific to the sequences, so it basically shows people coming in on the main page, checking out discussions or maybe an article, going to the user sections or discussions or maybe a different article, and so on. It’s not really useful for figuring this out.
“Does reading stop, and if so, when and where?”
Everywhere. Most of the pages I’ve seen on there have an 80% or 90% bounce rate. The question here is what pages do they NOT quit reading on?
*restricts the analytics view to landing pages with < 60% bounce rate and orders them by total visits*
Well, look at that. “The Quantum Physics Sequence” is the first sequence page in the list. The next piece of writing is “Harry Potter and the Methods of Rationality” which has a somewhat lower bounce rate but not nearly as many visits.
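For anyone without Analytics access, that filter amounts to something like the following; the titles are from the thread, but the visit counts and bounce rates below are made up:

```python
# Sketch of the filter described above: keep landing pages with a bounce
# rate under 60% and order by total visits, descending. Numbers are invented.
pages = [
    ("The Quantum Physics Sequence", 12000, 0.55),
    ("Harry Potter and the Methods of Rationality", 9000, 0.48),
    ("Timeless Physics", 15000, 0.85),   # high bounce rate: filtered out
]

keepers = sorted(
    (p for p in pages if p[2] < 0.60),   # bounce rate below the 60% cutoff
    key=lambda p: p[1],                  # sort by total visits...
    reverse=True,                        # ...most-visited first
)
for title, visits, bounce in keepers:
    print(f"{title}: {visits} visits, {bounce:.0%} bounce")
```

The interesting output is exactly what the comment reports: which pages hold readers (low bounce) *and* draw real traffic (high visits), rather than excelling at only one of the two.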