I fail to see why the shards have to be perfectly isolated in this scenario. It seems plausible that the AI could automatically import all the changes made by my best friend in his simulation into mine, and vice versa; and more broadly, weave bits and pieces of other people's “real actions” into my ongoing narrative.
Ultimately, everyone in my universe could be “intermittently real” in proportion to how much their own actions contributed to my utopia, with the rest of their screen time handled by an AI stand-in that acts the way I'd like them to act. (For example, everyone on Twitter could be a real person in another simulation; following them would start to leak their reality into mine.)
This is sounding oddly familiar, but I can’t put my finger on why.
This is somewhat similar to an idea I call ‘culture goggles’, under which all interpersonal interactions go through a translation suite.