A sufficiently detailed record of a person’s behavior could be used to fully reconstruct their psychology. This might constitute a form of immortality, but even if it doesn’t, complete knowledge of the past would be a beautiful thing to have; it would make the future richer to know its past and to be able to bring back faithful replicas of the people who built it.
So this is an important question. To produce sufficiently detailed records… is that already happening by default? Is the ad industry already keeping all of that stuff? Will it all eventually make its way to light?
I’ve been assuming it’s happening. Storage is cheap. The data has lots of buyers.
If not, then I’m going to need to start recording and backing stuff up more thoroughly.
I don’t really know, but as I understand it, there are laws in Europe preventing companies from keeping data indefinitely. Also, ad companies might just be keeping the extracted insights they need rather than the raw data. Downloading your chat logs seems really cheap, so it seems to me worth the marginal cost in any case.
By the way, you might be interested in Lifelogging as life extension for more content on this topic.
What you have in mind is “A sufficiently detailed record of a person’s behavior when interacting with the computer/phone”
How is that sufficient to any reasonable degree?
What sorts of things, that you would want preserved, or that the future would find interesting, would not be captured by that?
Well, for one, did you ever notice how people act differently in different situations? (for example among family, friends, work, acquaintances at the gym, or online) If you limit yourself to a single situation, there isn’t any person on earth that you could ‘reconstruct’ sufficiently well.
A minor example… I’m fairly sure you can make guesses about what kinds of expressions a person makes a lot from a few photos of their face. I’m not sure what else to point at to convey this intuition, but I seem to believe that behaviors in very different contexts leak information that’ll all become apparent with enough data.
I guess I can believe that there are probably a lot of people who don’t output enough content for this to work, maybe even among the users of this forum, but I don’t think it’s a large proportion of them.
Christ I hope not
I want the option of escaping from a rogue AI via cremation
If it helps, remember that there is a significant likelihood of you being in an ancestor simulation. You have no knowledge of what is outside the simulation, so it is entirely possible that regardless of your actions, you will be tortured for an up-arrow-notation amount of time upon death (or maybe literally forever, if the laws of physics/logic are different outside of the sim).
Thus, you shouldn’t be too stressed about destroying any information about yourself—it only makes a quantitative instead of a qualitative difference in terms of potential AI torture. That is, instead of P(AI torture | no information destruction) = 0.01 and P(AI torture | information destruction) = 0, it’s more something like P(AI torture | no information destruction) = O + 0.01 and P(AI torture | information destruction) = O, where O is the probability of the out-of-sim AI torturing you anyway. I find this to be a more soothing way to think about the problem, since it takes advantage of a few cognitive biases to make the importance of information destruction feel less emotionally critical.
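The toy model above can be sketched in a few lines of code (the function name and the 0.01 figure are just the illustrative numbers from the comment; O is unknown by construction):

```python
# Toy model of P(AI torture) from the comment above. Values are illustrative.
# baseline_O: probability that an out-of-sim AI tortures you regardless of your actions.
def p_torture(baseline_O, destroyed_info):
    """Destroying your information removes only the in-sim 0.01 contribution."""
    return baseline_O + (0.0 if destroyed_info else 0.01)

# Whatever O is, destruction shifts the probability by the same absolute 0.01.
for O in (0.0, 0.2, 0.5):
    diff = p_torture(O, destroyed_info=False) - p_torture(O, destroyed_info=True)
    print(f"O={O}: difference = {diff:.2f}")
```

Note that under this model the absolute difference made by information destruction is the same 0.01 for every value of O; only its share of the total risk shrinks as O grows.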
I mean, that doesn’t really have any relevance to the question of how I should think or act. Information destruction is exactly as important whether or not there is some chance you’re screwed anyway, by independence of irrelevant alternatives.