Cognitive weirdtopia:
Any time you are making a potentially life-changing decision (e.g. following this career or that one, committing to a relationship or ending it), you can ask an AI to produce several simulations of yourself from 10 years later who made different decisions. Then you can discuss with them, or they can discuss with each other, so that you get a good idea of how each choice will personally change you—not just in the sense of pure stats (money made, etc.), but in the sense of what sort of person you’re likely to be.
Inspired by the “20 2020 Pennies” arc in the Penny and Aggie webcomic (ETA: which I discuss to a greater extent in a discussion post of its own).
Heck, why restrict this to isolated life-changing decisions? I’d rather the AI assemble a party I can join whenever I wish, that is populated by a Dunbar-sized group of me from representatively sampled futures.
I automatically read “party I can join” in the RPG “adventuring party” sense. Uh-oh.
Well, for LARP players, the two meanings aren’t completely disjoint.