Does every rationalist protagonist come out of the box thinking they’re the queen or king?
I dunno, but I’ve never read one that doesn’t.
If you considered yourself able to take over the world (which all ultra-rationalist characters, whether protagonists or antagonists, seem to), then actually taking it over would be one of the most rational things you could do.
If you considered yourself able to take over the world (which all ultra-rationalist characters, whether protagonists or antagonists, seem to)
This is the part that confuses me, though. Maybe it’s just because I’ve studied economics and/or am a libertarian, but once you realize that the world is ordered by systems rather than by people, the idea of taking over the world becomes ludicrous. There’s a great Steve Jobs quote:
When you’re young, you look at television and think, There’s a conspiracy. The networks have conspired to dumb us down. But when you get a little older, you realize that’s not true. The networks are in business to give people exactly what they want. That’s a far more depressing thought. Conspiracy is optimistic! You can shoot the bastards! We can have a revolution! But the networks are really in business to give people what they want. It’s the truth.
And so, when I see someone who thinks they can, let alone should, take over the world, the first impression I get is the “d’awww” that mature adults feel when they see a pretentious teenager. And maybe stories about pretentious teenagers are valuable? But I found that I became a mature adult by, among other things, reading about mature adults. And it seems to me that a pretty important precondition to rationality is being a mature adult (or, perhaps more correctly, the idea of being rational and the idea of mature adulthood are strongly connected).
I agree on all points. But tales of ultra-rationalists (at least when they’re protagonists) are wish-fulfillment stories. That’s why they’re fun to read. If they were realistic then they’d quickly end up with the protagonist shattered, her loved ones killed or scattered, and the ruling systems just as entrenched as ever (if not even stronger).
Which is why I’m pretty sure Elspeth will succeed where Bella failed. These sorts of stories aren’t meant to be accurate reflections of the real world, they’re meant to give a vicarious thrill as the hero finally brings down all those bastards that are screwing things up for the rest of us. If I wanted a realistic story that ended in the crushing of the human soul under the heel of an inhuman system I’d watch the news.
There is, of course, a third option. The rationalist who sets their sights on something human-scaled instead of humanity-scaled is likely to do very well for themselves.
And so, in some sense, it’s worth examining the scope and effect of wish-fulfillment stories. If I play a lot of video games where I’m the only relevant character who reshapes all of reality around him according to my whims, what does that do to my empathy? My narcissism? My ability to reshape reality, and my satisfaction with the changes I attempt? If I read a lot of books about the lives of software pioneers and their companies, what does that do to my empathy? My narcissism? My ability to reshape reality, and my satisfaction with the changes I attempt? If I read a lot of books about successful relationships, how people work, and how to control myself, what does that do to my empathy? My narcissism? My ability to reshape reality, and my satisfaction with the changes I attempt?
It’s difficult to write a story about start-ups. (Either the idea is good and has been done, so you’re writing history instead of fiction; or it’s good and not done, in which case you should be doing it, not writing about it; or it’s bad, in which case disbelief will be hard to suspend.) But it’s easy to see someone using rationality to turn around their relationship, their life, a school, or a business.
The author’s problem is twofold: those problems are hard, and those problems are local. Stories tend to go for the cheapest thing: the cheapest/simplest plot is one person punching another, and the cheapest emotional hook is the fate of the world.
But those problems are solvable. And I, as suggested, would love a rationalist story where the hero devotes their time to solving useful (even if limited) problems instead of figuring out the best way to punch someone. You can see this in HP:MoR: compare the chapters where Harry is trying to figure out magic, or convert Draco, to the Azkaban chapters. (I am in the very early stages of starting a rationalist work along these lines; I abandoned fiction writing ~6 years ago and do not expect to be good at it, but we’ll see if I get happy enough with it to show the public.)
Which is why I’m pretty sure Elspeth will succeed where Bella failed.
My thought here is that, for Elspeth, succeeding means having an accurate idea of the scope at which she can change the world. That was Bella’s core failure: her delusions of grandeur. For Elspeth, who has a less useful power, no ultra-rich family with several massively useful witches, and only half-vampire status, defeating the Volturi where Bella failed seems to me impossible (unless she goes with the “well, I’m going to steal a bunch of money, buy a bunch of explosives, and burn Volterra to the ground” plan, which is definitely not the utilitarian way to conduct regime change).
I don’t mind that this lesson, which is of critical importance, is a really painful one. Pain is the best teacher. I mind that Bella didn’t know it beforehand, but it’s a reasonable flaw to give a character (especially if you’re writing for the LW community, apparently). But if Alicorn has the second book narrated by Elspeth and she makes the same mistakes as Bella (especially if she lucks into a win), then I will stop reading in disgust.
It’s difficult to write a story about start-ups. (Either the idea is good and has been done, so you’re writing history instead of fiction; or it’s good and not done, in which case you should be doing it, not writing about it; or it’s bad, in which case disbelief will be hard to suspend.)
Unless it’s been done in the real world but not in the world you’re writing in, in which case you may be Terry Pratchett.
Put in a separate post: I am strongly considering writing a top-level post about the failings of utilitarianism, because I see that as very strongly linked to Bella’s scope failure (the utilitarian goal is the Volturi gone, thus I should eradicate the Volturi). I’ll also write it for people, not for fictional characters, if that’s a worry.
If you are interested in seeing my thoughts on the matter, vote this up; if disinterested, vote this down. (But not negative, please, my karma is tiny!)
I would strongly prefer that my characters not be used as examples in non-fiction didactic works at least until I announce that I have finished with the story. (I currently expect Radiance to be the last work I do in the universe.)
This is a reply deep in a thread on a relatively old post; you likely won’t get many people who even see this request. :) If you’re nervous about publishing a top-level post, at least float something on the discussion side. I agree that utilitarianism is severely flawed. My reason is that humans simply don’t have enough computing power to ever implement utilitarianism decently; it would take an entity with orders of magnitude more intellectual strength to be a utilitarian without falling into a Bella spiral.
Deontology that is steered by utilitarian goals and occasionally modified by utilitarian analysis, OTOH, seems very workable and keeps the best of utilitarianism while factoring in human realities (but I’ve been plugging Desirism for a while now).
If you’re nervous about publishing a top-level post, at least float something on the discussion side.
That indeed looks like exactly what I was looking for. I had seen people use the pattern I modeled while reading through comments, which were probably from before the discussion section was implemented.
I abandoned fiction writing ~6 years ago and do not expect to be good at it, but we’ll see if I get happy enough with it to show the public
Sounds interesting; you should put it up somewhere regardless, because A) it can’t be that bad, and B) unless you’ve got a public reading it and demanding more, you’ll very likely never actually go forward with it. :)
If you considered yourself able to take over the world (which all ultra-rationalist characters, whether protagonists or antagonists, seem to), then actually taking it over would be one of the most rational things you could do.
But not one of the wisest (because most people who have taken over suddenly realize exactly how much of a pain it is to actually run the world they’ve just taken over).
I would strongly prefer that my characters not be used as examples in non-fiction didactic works at least until I announce that I have finished with the story. (I currently expect Radiance to be the last work I do in the universe.)
Understood and anticipated, though I certainly could have been clearer. I would write about what I know (anarchism), not what I don’t know (your characters).
But not one of the wisest (because most people who have taken over suddenly realize exactly how much of a pain it is to actually run the world they’ve just taken over).
Exactly; Baron Wulfenbach from Girl Genius comes to mind, as does The Onion article entitled “Black Man Given America’s Worst Job.”