You could actually write a non-fiction book about the incidents and beliefs within this community, tell people it is science fiction, and they would review it as exaggerated fantasy.
I recently was considering writing a post-apocalyptic science fiction story where people are on a quest to find Roko’s deleted post, believing it to contain the key to defeating the tyrannical superintelligence.
How about one where people destroyed the Internet, burned all books, and killed all academics to contain the dangerous knowledge let loose by Roko? The preface would explain that this was the downfall of the modern world. The actual story would then be set in the year 4110, when the world has not just recovered but invented advanced AI and many of the other technologies we dream about today.

The plot would follow a team of AI-supported cyborg archaeologists on Mars discovering an old human artifact from the 2020s, some kind of primitive machine that could be controlled from afar to move over the surface of Mars. When they tap its internal storage, they are shocked: it looks like the last upload from Earth was all the information associated with the infamous Roko incident that led to the self-inflicted destruction of the first technological civilisation over 2000 years ago. Sure, the archaeologists only know the name of the incident that led a group of people to destroy civilised society. But there's a video too! A Chinese-looking man can be seen, panic in his eyes and loud explosions in the background. Apparently an SIAI assault team is trying to take out his facility, as a repeated message can be heard coming in from a receiver: "We are the SIAI. Resistance is futile..." He explains that he is going to upload all the relevant data to let the future know that it was all for nothing...then the video suddenly ends. Long-established measures are instantly taken to sandbox the data for further analysis.

The epilogue then tells how people are aghast that the ancients destroyed their civilisation over such blatant nonsense. How could anyone have taken those ideas seriously, when every kid knows that a hard takeoff isn't possible, that artificial intelligence can only develop gradually, and that any technological civilisation merges with its machines rather than being ruled by them? Even worse, the ancients had absolutely no reason to believe that an intelligence with drives broad enough to include the urge to evolve could easily arise by accident; now people know that such a mind has to grow and requires the cooperation of the world beyond itself. And the moral of the story would be that the real risk is taking mere ideas too seriously!
Maybe I’m generalizing from one example here, but every time I’ve imagined a fictional scenario where something I felt strongly about escalated implausibly into warfare, I’ve later realized that it was a symptom of an affective death spiral, and the whole thing was extremely silly.
That’s not to say a short story about a war triggered by supposedly-but-not-actually dangerous knowledge couldn’t work. But it would work better if the details of the knowledge in question were optimized for the needs of the story, which would mean it’d have to be fictional.
There are stories about dangerous knowledge and stories about censorship gone mad, but I can't think of one where the reader themselves isn't sure which it is.
There’s a related concept in the stage production Urinetown, where the draconian controls of the police state turn out to have been necessary all along; and the Philip K. Dick short story The Golden Man, where the government’s brutal crackdown on mutants and sadistic experimentation are defied by a lone researcher, directly leading to implied cosmic waste.
But the closest story I can think of to ambiguous censorship is Scissors Cut Paper Wrap Stone, where the protagonist controls some Langford Basilisks; and censorship per se doesn’t play a big part in the plot.
There’s a related concept in the stage production Urinetown, where the draconian controls of the police state turn out to have been necessary all along
As a musical, Urinetown is okay, but its premise does not make sense. In spite of the water shortage, and despite having the wherewithal to institute massive societal change to manage it, they have somehow continued using restroom facilities that cost water, and the only reason they don't all die is that they charge money to use those facilities, as though that would affect how much waste a person produces. All this instead of water-free facilities, or better yet, reclamation.
And given that the Haber-Bosch process for making nitrogen fertilizer requires water (to produce the hydrogen gas), it seems a little stupid to ban public urination rather than simply insisting that people urinate on trees or into buckets for their farmers to use as fertilizer.
I recently was considering writing a post-apocalyptic science fiction story where people are on a quest to find Roko’s deleted post, believing it to contain the key to defeating the tyrannical superintelligence.
Given what the post was alleged to do to its readers, it would be the most downer of all endings.
There are stories about dangerous knowledge and stories about censorship gone mad, but I can't think of one where the reader themselves isn't sure which it is.
The Pillowman and The Metal Children, both recent plays, come to mind.