Neat idea! I’ve been thinking about things along this line for years because of science fiction writers like Charles Stross, who wrote about having digital clones of yourself that you could spawn to act as assistants — for example, sending one into a simulation with a set of information to be analyzed, running it faster than realtime, and getting back a report. One use case: analyzing potentially infohazardous information sent by an untrusted party.
There was also the idea that people might send their digital clones on a date first, and only agree to a real date if both clones came back with a positive recommendation.
Of course, literally using a digital clone of yourself requires being rather cavalier about destroying conscious beings once you're done using them as tools. It seems to make a lot more sense to use a non-conscious tool-AI without emotions for this sort of purpose.