Either this is self-contradictory, or it means ‘never be wrong’.
I think if you are making up your mind on unsettled empirical questions, you are a bad Bayesian. You can certainly make decisions under uncertainty, but you shouldn't make up your mind. And anyway, I am not even sure how to assign priors for the upload-fidelity questions.
In that case, you're the one who made the jump from 'goes against consensus' to 'this was assigned 0 probability'. If we all agreed that some proposition was 0.0001% likely, then claiming that this proposition is true would seem to me to be going against consensus.
Ok, what exactly is your posterior belief that uploads are possible? What would you say is the average LW posterior belief in the same? Where did this number come from? How much 'cognitive effort' is spent at LW thinking about the future where uploads are possible vs. the future where uploads are not possible?
To answer the last question first—not a heck of a lot, but some. It was buried in an ‘impossible possible world’, but lack of uploading was not what made it the impossible possible world, so that doesn’t mean that it’s considered impossible.
To answer your questions:
-- Somewhere around 99.5% that it’s possible for me. The reasons for it to be possible are pretty convincing.
-- I would guess that the median estimate of likelihood among active posters who even have an estimate would be above 95%, but that's a pretty wild guess. Taking the average would probably amount to a bit less than the fraction of people who think it'll work, so that's not very meaningful. My estimate of that is rough. I checked the survey, but the most applicable question was cryonics, and of course cryonics can be a bad idea even if uploading is possible (if you think that you'll end up being thawed instead of uploaded). And if you somehow think you could be healed instead of uploaded, it could go the other way. 60% were on the fence or in favor of getting cryonically preserved, which means they think that the total product of the cryo Drake equation is noticeable (a toy version of that product is sketched below, after these answers). Most cryo discussions I've seen here treat organization as the main problem, which suggests that a majority consider recovery a much less severe problem. Being pessimistic for a lower bound on that gives me 95%.
-- The part of uploading most likely to fail is the scanning. Existing scanning technology can handle anything as large as a dendrite (though in an unreasonably large amount of time). So for uploading to be impossible, it would have to require either dynamical features, or features which would necessarily be destroyed by any fixing process with no other viable way to recover them.
The former seems tremendously unlikely, because personality can recover from some pretty severe shocks to the system like electrocution, anaerobic metabolic stasis, and inebriation (or other neurotoxins). I'd say the chance that there is some relevant dynamical process containing crucial, non-deducible information is maybe 1 in 100,000, ballpark. Small enough that it's not significant.
The latter seems fairly unlikely as well. Suppose plastination or freezing erases some dendritic state, and that state encodes personality information. It seems very unlikely indeed that there's literally no way around this at all, that no choice of fixing method possible within the laws of physics would work. Maybe one in 20 that we can't recover that state, and maybe one in 20 that it was vital to determining long-term psychological features (for the reasons outlined above, though weakened, since we're allowing that this state is not transient, just fragile). These are order-of-magnitude figures.
Certainly, our brains are far larger than they need to be, so it seems like you're not going to run into the limits of physics. Heisenberg uncertainty is irrelevant at these scales, and the observer effect won't bite you at full strength, because you can use probes much less energetic than the cells in question. If nothing else, you should be able to insinuate something into the brain and measure it that way.
But of course I could have screwed up my reasoning, which accounts for the rest of the 0.5%. Maybe our brains are sufficiently fragile that you're going to lose a lot when you poke them hard enough to get the information out. I doubt it to the tune of 199:1. As a check, I would feel comfortable taking a 20:1 bet on the subject, and not comfortable with a 2000:1 bet on it.
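To make the arithmetic behind that explicit, here is a minimal sketch of how the component estimates above stack up to the 0.5%, treating the failure modes as roughly additive and lumping whatever is left over into 'I screwed up my reasoning'. The exact split of that leftover is my own bookkeeping for illustration, not something I claim is precise.

```python
# Rough sketch of the arithmetic behind the ~99.5% figure, treating the
# failure modes above as roughly additive. The split of the leftover into
# "reasoning error" is illustrative bookkeeping, not a precise decomposition.

p_dynamical   = 1 / 100_000             # crucial, non-deducible dynamical state
p_fixing_fail = (1 / 20) * (1 / 20)     # state unrecoverable AND vital: 1/400
p_reasoning   = 0.005 - (p_dynamical + p_fixing_fail)  # leftover model error

p_impossible = p_dynamical + p_fixing_fail + p_reasoning   # 0.0050
p_possible   = 1 - p_impossible                            # 0.9950

print(f"P(uploading impossible) ~ {p_impossible:.4f}")
print(f"P(uploading possible)   ~ {p_possible:.4f}")
print(f"odds in favor           ~ {p_possible / p_impossible:.0f}:1")  # 199:1
```

That ratio is where the 199:1 comes from; the comfortable 20:1 bet and the uncomfortable 2000:1 bet bracket it by roughly an order of magnitude on each side.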
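And since I leaned on the 'cryo Drake equation' in the second answer, here is a toy version of that product. The factor names and numbers are invented purely for illustration (they are not anyone's actual estimates); the only point is that being on the fence about cryonics implies you think the whole product is non-negligible, which is hard to do unless you also think the recovery/uploading factor is reasonably large.

```python
# Toy "cryo Drake equation": the chance cryonics pays off, as a product of
# factor estimates. Factor names and numbers are invented for illustration.

factors = {
    "preserved_promptly": 0.7,   # you die in circumstances allowing prompt preservation
    "structure_survives": 0.8,   # freezing/plastination keeps the relevant neural structure
    "org_survives":       0.4,   # the cryonics organization lasts long enough
    "recovery_possible":  0.95,  # scanning/uploading (or repair) turns out to be feasible
    "actually_revived":   0.6,   # someone eventually bothers to revive or upload you
}

p_payoff = 1.0
for name, p in factors.items():
    p_payoff *= p

print(f"product of factors ~ {p_payoff:.3f}")  # ~0.13 with these toy numbers
```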
~~~~
Of course, the real reason that we don’t talk too much about what happens if uploading isn’t possible is that that would just make the future that much more like the present. We know how to deal with living in meat bodies already. If it works out that that’s the way we’re stuck, then, well, I guess we don’t need to worry about em catastrophes, and any FAI will really want to work on its biotech.
Ok, thanks for a detailed response. To be honest, I think you are quibbling. If your posterior is 99.5%, and 95% if being pessimistic, you made up your mind essentially as far as a mind can be made up in practice. If the answer to the upload question depends on an empirical test that has not yet been done (because of lack of tech), then you made up your mind too soon.
Of course, the real reason that we don't talk too much about what happens if uploading isn't possible is that that would just make the future that much more like the present.
I think a cynic would say you talk about the upload future more because it's much nicer (e.g. you can conquer death!).
If your posterior is 99.5%, and 95% if being pessimistic, you made up your mind essentially as far as a mind can be made up in practice. If the answer to the upload question depends on an empirical test that has not yet been done (because of lack of tech), then you made up your mind too soon.
These two statements clash very strongly. VERY strongly.
They don’t. 99.5% is far too much.
If you can predict the outcome of the empirical test with that degree of confidence or higher, then they're perfectly compatible. We're talking about what's physically possible with any plan of action and physically possible capabilities, not merely what can be done with today's tech. The negative you're pushing is actually a very, very strong nonexistence statement.