If there’s a point to that plot-thread-summary, I guess it’s in your final sentence:
The lesson is then that reason is best, and instinct will do, but something in-between, instinct plus a crippled reason that takes itself as seriously as if it were the real thing, leads to madness.
If this is just a proposed way of reading Winterson’s text, it has no relevance to LessWrong and would be better directed to a literary magazine. If you think that it is a true statement about the proper relation between reason and instinct, then you need to actually say so, and say why, because at present you are neither asserting it, nor providing any reason to believe it.
If you were to do so, that would be relevant, although it amounts to Spock-style rationality, which I doubt will play any better.
LessWrong needs to deal with emotions as part of rationality. Strangely, people are eager to upvote Julia Galef’s post on the importance of emotions in rationality, yet eager to downvote my attempt to deal with emotion and rationality on LessWrong.
LessWrong needs to deal with emotions as part of rationality.
Certainly.
Strangely, people are eager to upvote Julia Galef’s post on the importance of emotions in rationality, yet eager to downvote my attempt to deal with emotion and rationality on LessWrong.
Don’t spend >90% of your word count summarizing a novel next time.
The last paragraph was interesting, and at least some of the setup was required for this specific point, but it felt like a very low signal-to-“why am I reading all these excerpts from a seemingly-arbitrarily-selected 1992 novel” ratio.
Basically, I finished the article feeling like I had a pretty good idea what happened in the novel, but very little new insight into the combination of love and reason, or even what PhilGoetz thinks about it.
I appreciate your explanation, but I don’t think you understand how novels work. They are not logical arguments that can be summarized in a one-paragraph conclusion. If you want to take emotions seriously, you need to speak their language. You can’t do it all analytically.
In that case, summarizing a novel may not be among the better ways to discuss emotions as part of rationality.
I think that if you want to raise the message of a book as a point in a discussion, it’s better to determine whether you have reasons for taking its contentions seriously beyond their use in an engaging story, and then explain those.
Eliezer’s article is about people taking scenarios from science fiction about artificial intelligence as evidence of what artificial intelligence is like. In a story like this one, the summary itself is the evidence, and I can’t analyze it and explain it to you in anything shorter than a plot summary. If I could, it would be a bad novel. The purpose of this type of novel, as opposed to a Terminator action-adventure flick, is to explore things that are too complex for us to analyze. Any novel that could be analyzed in the way you’re suggesting would be a bad novel.
Just because that’s the specific focus in the article doesn’t mean that the point is so narrow. Just as it’s incorrect to suppose that a sci-fi story gives us a useful picture of how society would be transformed by certain technologies, it’s also a mistake to conclude, for instance, that a story about a bunch of young boys stranded on an island who devolve into barbarism is a useful case study in human nature. The contents of the book never happened; it’s just something someone imagined, and to the extent that the author’s belief that such a thing might happen constitutes evidence, we can do better by looking at what reasons a person would have to believe it in the first place.
Any novel whose experience could be replicated via the process I described would be a bad novel, but what you’d be leaving out would not actually be evidence for the truth of the points the novel is contending.
Apologies, I should clarify.
I don’t think a longish summary was inappropriate. I’m not even sure the specific amount of summary you used was inappropriate—if I were an editor, I’d have an eye out for parts which you could get away with trimming, but that’s just editing in general.
I DO think there was too little unpacking and exploration of your thesis. The 1800 words of summary aren’t the problem; it’s that 200 words of analysis is pretty sparse.
Strangely, people are eager to upvote Julia Galef’s post on the importance of emotions in rationality, yet eager to downvote my attempt to deal with emotion and rationality on LessWrong.
Not strange at all. People presumably think her attempt is better than yours.