Wei_Dai wasn’t saying there was. He was supposing that conventional brain scans produce data entangled with your particular brain, and that enough such scans might let a future superintelligence reconstruct you with adequate fidelity. If FAI is coming but I buy the farm first, I would prefer lots of MRIs + a frozen brain to just a frozen brain; and I would prefer lots of MRIs to nothing at all.
edit: I do, of course, think the entanglement is weak enough that 10GB of head MRIs << 10GB of mind state.
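To make the “<<” intuition concrete, here is a toy back-of-envelope in Python. Every number in it (the novelty fraction, the decay rate) is an invented placeholder, not a measurement; the only point is that redundant re-scans of the same anatomy add useful information far more slowly than they add bytes.

```python
# Back-of-envelope: why N scans' worth of raw bytes can vastly exceed
# the mind-relevant information they pin down. All numbers below are
# illustrative assumptions, not measurements.

SCAN_BYTES = 10 * 2**30       # ~10 GB of raw head MRI data per session
NOVELTY_FRACTION = 1e-4       # assumed fraction of a scan's bits that are
                              # new, mind-relevant information (placeholder)
N_SCANS = 100

# Each additional scan mostly re-measures the same anatomy, so model
# the novel information per scan as shrinking geometrically.
per_scan_bits = SCAN_BYTES * 8 * NOVELTY_FRACTION
useful_bits = 0.0
for i in range(N_SCANS):
    useful_bits += per_scan_bits * (0.5 ** i)  # diminishing returns

print(f"raw data:        {N_SCANS * SCAN_BYTES * 8:.3e} bits")
print(f"useful (toy):    {useful_bits:.3e} bits")
```

Under these made-up assumptions the raw archive is roughly six orders of magnitude larger than the information it actually contributes toward a reconstruction, which is the shape of the “weak entanglement” claim.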
Sure, but these things cost money and we have finite resources. A dedicated fMRI machine would cost somewhere between 2 and 6 times SIAI’s annual expenditures.
That’s a separate argument from the one over whether such a scan is possible with present technology. I agree that an fMRI machine shouldn’t be a budget priority for SI; I also don’t consider it worth my money to get frequent MRIs (although I have saved the one I got for other reasons last year). If I were sufficiently wealthy, though, I’d buy frequent brain scans before I’d buy, say, a Ferrari (and I do really enjoy fast, flashy cars).
Such a scan isn’t possible with present technology. What is possible are scans and other methods of recording information that could conceivably be relevant to an attempt to reconstruct your mind at a future time. If the argument is simply that brain scans “couldn’t hurt”… well, ‘duh’. But making a diary of your frequent thoughts or videotaping yourself as you go about your day couldn’t hurt either. Knowing the regional blood-flow patterns in your brain in response to narrow and limited stimuli is not in a significantly different category.
The question is whether the cost and time involved in these endeavors beat plausible counterfactual spending. My point isn’t just that SI shouldn’t purchase a machine; it’s that giving to existential-risk research, or ensuring the financial stability of your cryonics organization, probably has a better return for your own long-term survival than getting frequent fMRI scans does. Pointing out that brain scans have a better return than an expenditure that probably lowers your long-term survival rate (a fast car, with its accident risk) is not a strong argument.
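To make the counterfactual comparison concrete, here is a hedged expected-value sketch. Every cost and probability below is a placeholder assumption for illustration, not an estimate; the argument turns entirely on what the real numbers are.

```python
# Toy marginal-value comparison of spending options. All costs and
# probability deltas are placeholder assumptions, not real estimates;
# the point is only the shape of the comparison being argued.

options = {
    # name: (cost in $, assumed change in P(long-term survival))
    "frequent fMRI scans":       (20_000, +0.0001),
    "cryonics org stability":    (20_000, +0.0100),
    "existential-risk research": (20_000, +0.0050),
    "fast flashy car":           (20_000, -0.0010),  # accident risk
}

for name, (cost, delta_p) in options.items():
    per_1k = delta_p / (cost / 1000)
    print(f"{name:27s} delta-P per $1k: {per_1k:+.6f}")
```

On placeholder numbers like these, scans beat the car (which has negative return) but lose to both alternatives, which is exactly why “better than a Ferrari” isn’t a strong argument for them.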