Rationalist fiction: a Slice of Life IN HELL
“If you’re sent to Hell for that, you wouldn’t have liked it in Heaven anyway.”
This phrase inspired in me the idea of a Slice of Life IN HELL story. Basically, the strictest interpretation of the Abrahamic God turns out to be true, and, after Judgment Day, all the sinners (again, by the strictest standards), the pagans, the atheists, the gays, the heretics and so on end up in Hell, which is to say, most of humanity. Rather than a Fire and Brimstone torture chamber, this Hell is very much like earthly life, except it runs on Murphy’s Law turned Up To Eleven (“everything that can go wrong, will go wrong”), you can’t die permanently, and it goes on forever. It’s basically Life as a videogame, set to Maximum Difficulty, except with real pain and suffering.
Our stories would focus on actually decent, sympathetic people, who are there for things like following the wrong religion, having sex outside missionary man-on-woman, failing to observe the little daily rituals, or even just being lazy. They manage to live more-or-less decently because they’re extremely cautious, rational, and methodical. Given that reality is out to get them, this is a constant uphill battle, and even the slightest negligence can have a terrible cost. Thankfully, they have all the time in eternity to learn from their mistakes.
This could be an interesting way to showcase rationalist principles, especially those regarding safety and planning, in a perpetual Worst Case Scenario environment. There’s ample potential for constant conflict, and for sympathetic characters who, the audience can feel, really didn’t deserve their fate. The central concept also seems classically strong to me: defying the Status Quo and cruel authorities by striving to be as excellent as one can be, even in the face of certain doom.
What do you guys think? There are lots of little details to specify, and many things that I believe should be marked as “must NOT be specified”. Any help, ideas, or thoughts are very welcome.
I don’t see much good in associating rationality with extreme caution.
I am also not sure how much use rationality will be in a world where reality is out to get you. It is merely unlikely that all the air molecules in a room will decide to be elsewhere, leaving hard vacuum in their place, or that you fall a hundred feet underground through solid rock and then remain entombed there for a REALLY long time...
How improbable the bad outcomes are would depend on which level of Hell you’re in, perhaps?
Wildbow, of Worm fame, describes something like that in Pact. The protagonist is unwillingly thrust into a world where he has inherited a karmic debt from previous generations, and so the universe itself is, in effect, out to get him.
The serial is not explicitly “rationalist”, but irrational (in that universe) decisions bite him in the rear pretty quickly. And so do rational decisions sometimes. Even your best actions against a hostile universe can only get you so far.
Is this similar to what you had in mind?
Indeed. Although, frankly, what I’ve seen of Worm so far suggests it’s very similar to my idea of Hell; every accomplishment is either made moot or costs something irreplaceable and possibly of superior value, every victory is short-lived, every mistake is paid for dearly. Every situation is desperate, every problem urgent. By the time a conflict reaches its resolution, another is at its peak, and two more are right around the corner. Perhaps it’s even worse; hardship, instead of building character, corrupts it.
For the characters, it must be like a nightmare they can’t wake up from.
Yeah, Worm is pretty bleak. I tend to find that a bit overwhelming at times myself; I like the series because of its other strengths (diverse and interesting characters, intelligent plotting, deep and rich setting) with the oppressive tone being a small strike against it for me.
...And now I just spent most of my workday reading Pact. Upvoted for awesome. Thanks. :-)
Please do note the delicious irony here:
Which in essence looks suspiciously like cautiously assuming a bad-case scenario in which this story won’t help the rationality cause, or even a worst-case scenario in which it will do more harm than good.
If you want to go forth and create a story about rationality, then do it. Humans are complex creatures, not everyone will react the same way to your story, and anybody who thinks they can accurately predict the reaction of all the different kinds of people who’ll read your story (especially as this story hasn’t even been written yet) is either severely deluded as to their ability, or secretly running the world from behind the curtain already.
That’s me all right. Heck, now that the examples of Hellcity, Worm and Pact have been brought up, I feel like such a work would be redundant.
A silly question: Of all the recurring “employee” characters in Dilbert, which one is reacting most rationally to the situation they’re in? Probably Wally...
In terms of little details, I think “everything that can go wrong, will go wrong” must be specified right away, because if you let rationalists try to think “How bad could it be at maximum badness?” it will get very bad, very quickly.
For instance, Situation 1: imagine that every day you spend most of your time outside, you get struck by lightning, and every day you spend most of your time inside, there is an earthquake and whatever you are inside collapses on you.
I can see rationalists attempting to build, and spend most of their time in, structures made mostly out of pillows: they collapse, oh well, they get rebuilt in 30 minutes. It turns pain into a daily chore.
On the other hand, imagine Situation 2: every day, through hellish quantum mechanics, enough antimatter appears in contact with your skin to cause a non-fatal, but excruciating, matter-antimatter explosion.
Now, at this point, the rationalist might realize something like “Okay, well, I’ll arrange things in such a way that any explosion will fit into one of two categories: It will be fatal, or it won’t actually cause me pain.”
And while the rationalist is attempting to build the arrangement that does this, a giant bear comes by, breaks it, and painfully claws them to pieces (non-fatally).
Situation 3: Rationalists can be rationalist all they want, but they’ve been captured by the giant bears and had all of their limbs systematically clawed off, plus they’ve been blindfolded, gagged, earplugged, and are periodically used as claw sharpeners.
Of course, if some parts of Hell are like Situation 1, some parts are like Situation 2, and some are like Situation 3, I expect rationalists to attempt to figure out why that is, unless you want to have Situation 4:
Situation 4: There’s one constant rule of Hell: every time someone figures out all of the other rules of Hell, those rules change.
Ergo: Once someone figures out “Oh, well, I can avoid the Lightning and the Earthquakes with pillow structures.” then the Giant Bears and Antimatter Skin Explosions come. Once you figure out how to get used to being used as a Giant Bear claw sharpener, something else happens, and that thing is even worse.
Basically, there is a range of darkness you can have here, in terms of writing. In terms of difficulty levels, this might be expressed as:
1: Hard.
2: Impossible.
3: You’re helpless.
4: Struggling can only make it worse.
I was writing a story about a character starting at rock bottom and working their way up, and I actually had the entity setting this up mention to the character that there had been previous versions of the character who just went irrevocably insane, and were deleted and reset, because previous versions of ‘rock bottom’ had been set too low to ever get out of.
After learning the constant rule, you find a ruleset that doesn’t seem too awful, and then deliberately refrain from fully figuring it out.
Hell is trying to abstain from pattern matching.
Reminds me of talesofmu. Your strategy looks like trying to play the GM, and is likely to get you punished :)
Wouldn’t that count as learning a rule and cause the meta-level rules to change to something worse if you started using your knowledge to make it more tolerable?
It might do. You’d have to check by writing it, of course.
Ehh… As the other commenters are saying, it’s unclear how it would promote rationality, or what its Ultimate Effect would be...
But I think you should do it anyway. I’d read it.
The challenge is that rationalists should win, no matter what kind of environment they’re thrown into. One that’s out to screw them is only a middling challenge. Eventually, I’d like to tackle “how to be as rational/effective as possible in an actively irrational environment, such as the setting of The Sandman”.
Challenge to whom? To the omnipotent author? Doesn’t look much like a challenge (see the “omnipotent” bit). To the rationalists? It seems pretty obvious to me that there are environments where no winning is possible.
Establishing that winning is impossible is already a win of sorts. And writers are hardly omnipotent; we are governed by the stringent rules of Good Writing. An author who abuses their power willy-nilly only creates an unpersuasive mess that immerses and captivates absolutely no-one, and can hardly be said to be fiction at all.
Only if you choose to be so :-)
It’s not just choice; you have to learn them and interiorize them and they’re subjective. Grant Morrison and Alan Moore and Neil Gaiman can write incredibly confusing, irrational, impossible stories that nevertheless are plausible and gripping and immersive. This took them decades of experience. Your beginner fanfic writer, no matter how well-intentioned and studious, will fail on some fundamental level. Check out EY’s earliest fiction out there; it’s pretty damn terrible.
Where does one find this terrible early fiction?
Hellcity by Macon Blair and Joe Flood is as you describe and a good read.
I don’t think that teaching people to expect worst-case scenarios increases rational thinking.
Not expecting them, but anticipating them. Anticipating how things can go wrong, and pre-empting that. Like Harry buying that medikit. Although it turned out to be useless, because he hadn’t been prepared to do what it took to keep his friends safe.
It’s more rational to make expected utility calculations than to try to cover yourself against every worst-case scenario that you can imagine.
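As a rough sketch of what I mean (all the numbers and names here are made up purely for illustration), the decision for any single precaution looks something like this:

```python
# Toy expected-utility check for a single precaution.
# All figures are hypothetical; the point is the comparison, not the numbers.

def expected_loss(p_disaster: float, loss: float) -> float:
    """Expected loss from a disaster with the given probability and magnitude."""
    return p_disaster * loss

precaution_cost = 5        # hypothetical cost of taking the precaution
p_disaster = 0.01          # hypothetical chance the bad scenario actually occurs
loss_if_disaster = 1000    # hypothetical damage if it does occur

# Take the precaution only if it pays for itself in expectation,
# rather than guarding against every imaginable worst case.
if precaution_cost < expected_loss(p_disaster, loss_if_disaster):
    print("Worth taking this precaution")
else:
    print("Not worth it; spend the effort elsewhere")
```

The point being, you only buy the precautions whose expected benefit exceeds their cost, instead of trying to buy all of them.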
See pre-mortem.