In fiction, you have to make it up, but you can’t make it something implausible.
But any real scenario will seem implausible. That’s what the idea of a singularity is about. If you believe you can predict in any sense how the world will look afterwards, “singularity” is a very poor term to use.
I think it is a poor term.
Still, it can only mean ‘a whole lot less predictable than usual’, not ‘MAX ENTROPY ALL THE TIME’. Physics will still apply. Given that people survived and have lives worth writing stories about, we are at least within ‘critical failure’ distance of friendliness in AI. That narrows things very considerably.
A lot of the unpredictability of the singularity arises from a lack of proof of friendliness. Once you’ve cleared that (or nearly), the range of possibilities isn’t singular in nature.
If there’s nothing I can write that wouldn’t break your Willing Suspension of Disbelief about events after an intelligence explosion, then there’s nothing I can write to fix that, and nothing you can suggest to add to my story’s background; and both of us might spend our time more productively (by our own standards) if we focus on our respective projects.
If you have a world where you can predict events after an intelligence explosion, then that intelligence explosion, by definition, isn’t a singularity event.
There are several working definitions for the term ‘singularity’. If the definition you use means that you think a story involving that word is inherently implausible, then one possibility would be to assume that where you see me write that term, I instead write, say, “That weird event where all the weird stuff happened that seemed a lot like what some of those skiffy authors used to call the ‘Singularity’”, or “Blamfoozle”, or anything else which preserves most of my intended meaning without forcing you to get caught up in this particular aspect of this particular word.