Well, the problem with the former (knowledge harbored within the brain) is that it’s very vague and hard to define.
Say I have a method to improve the efficacy of VX (an easily weaponizable nerve agent), and as a utilitarian I conclude this information is going to be harmful. I can purge it from my hard drive, burn the papers I used to come up with it, and so on.
But I can’t wipe my head clean of the information; at best I can resign myself to never talking about it to anyone and to not according it much importance, so that I may eventually forget it. But that’s not destruction per se. It’s closer to lying, to withholding the information from everyone (even if asked for it specifically), or to biasing your brain towards transmitting and remembering certain pieces of information (which we do all the time).
However, I don’t see anything contentious about this case, nor about any other case of information destruction, as long as it is done for the greater utility.
I think in general people don’t advocate for destroying/forgetting information because:
a) It’s hard to do
b) As a general rule of thumb, the accumulation of information seems to be a good thing, even when the utility of a specific piece of information is not obvious
But this is more of a heuristic than an exact principle.
I’d agree that the first one is generally pretty removed from everyday reality, but I think it’s a useful thought experiment.
I was originally thinking of this more in terms of “removing useful information” than “removing expected-harmful information”, but good point; the latter could be interesting too.
Well, I think the “removing useful information” bit contradicts utility to begin with.
As in, if you are a utilitarian, useful information == information that helps maximize utility. Thus the trade-off is not possible by definition.
I can think of some contrived examples where the trade-off is possible (e.g. where the information is harmful now but will be useful later), but in that case it’s so easy to “hide” information in the modern age, rather than destroying it entirely, that the problem seems too theoretical to me.
But at the end of the day, assuming you reached a contrived enough situation where the information must be destroyed outright (or where merely hiding it would deprive other people of the ability to discover further useful information), I think the utilitarian perspective has nothing fundamental against destroying it. However, no matter how hard I try, I can’t really think of a very relevant example where this could be the case.
One extreme case would be committing suicide because keeping your secret is that important.
A less extreme case may be being OK with forgetting information: you’re losing value, but the cost of maintaining it wouldn’t be worth it. (In this case the information is positive, though.)