I don’t know how many people here would agree with the following, but my position on it is extreme relative to the mainstream, so I think it deserves a mention:
As a matter of individual rights, as well as for a well-functioning society, all information should be absolutely free; there should be no laws on the collection, distribution, or use of information.
Copyright, patent, and trademark law are forms of censorship and should be completely abolished. The same applies to laws on libel, slander, and the exchange of child pornography.
Information privacy is massively overrated; the right to remember, use, and distribute valuable information available to a specific entity should always override the right of other entities not to be embarrassed or disadvantaged by those acts.
People and companies exposing buggy software to untrusted parties deserve to have it exploited to their disadvantage. Maliciously attacking software systems by submitting data crafted to trigger security-critical bugs should not be illegal in any way.
Limits: The last paragraph assumes that there are no Langford basilisks; if such things do in fact exist, preventing basilisk deaths may justify censorship, based on the purely practical observation that fixing the human mind would likely not be possible shortly after discovery.
All of the stated policy opinions apply to societies composed of roughly human-intelligent people only; they break down in the presence of sufficiently intelligent entities.
In addition, if it were possible to significantly ameliorate existential risks by censoring certain information, that would justify doing so, but I can’t come up with a likely case for that happening in practice.
Agreed.
Also, if you pile on technological improvements but still try to keep patents and the like, you end up in the crazy situation where government intrusiveness has to grow without bounds and make hegemonic war on the universe to stop anyone, anywhere, from popping a Rolex out of their Drexlerian assembler.
I very strongly agree, except on the matter of trademarks. Trademarks make brand recognition easier and reduce transaction costs. Also, enforcing trademarks is more along the lines of preventing fraud, since a trademark only restricts how items in specific classes of goods may be identified (rather clumsily worded, but I’m trying to be concise, and legalities don’t exactly lend themselves to concision).
Isn’t yelling “fire!” in a crowded theater a kind of Langford basilisk?
Normally, when people say they believe “all information should be free”, I suspect they don’t really mean this, but since you claim your position is very “extreme”, perhaps you really do mean it?
I think information such as the PIN to my bank account, or the password to my LessWrong.com account, should not be freely accessible.
You don’t believe there is value in anonymity? E.g. being able to criticize an oppressive government, without fear of retribution from said government?
You make a good point; I didn’t phrase my original statement as well as I should have. What I meant was that there shouldn’t be any laws (within the limits mentioned in my original post) preventing people or companies from using, storing and passing on information. I didn’t mean to imply keeping secrets should be illegal. If a person or company wants to keep something secret, and can manage to do so in practice, that should be perfectly legal as well.
As a special case, using encryption and keeping the keys to yourself should be a fundamental right, and doing so shouldn’t lead to e.g. a presumption of guilt in a legal case.
I believe there can be value in anonymity, but the way to achieve it is by effectively keeping a secret either through technological means or by communicating through trusted associates. If doing so is infeasible without laws on use of information, I don’t think laws would help, either.
I think governments that would like to be oppressive have significantly more to fear from free information use than their citizens do.
When you use the PIN to your bank account, you expect both the bank and the ATM technicians and programmers to respect your secret. There are laws that either force them not to retain the PIN or punish them for misusing their position of trust. I don’t see how such situations, or cases of blackmail, would be resolved without assuming a person’s right not to have their secrets made public by others.
I’m not just nitpicking. I would love to see a watertight argument against communication perversions. Have you written anything on the topic?
Agreed.
I don’t agree with it. You can’t believe everything you read in Wired. The “information should be free” movement is just modern techno-geek Marxism, and it’s only sillier the second time around.
All software is buggy. All parties are untrusted.
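The kind of security-critical bug the original post has in mind can be sketched in a few lines (a hypothetical example of mine, not from the thread): a service that passes untrusted input to `eval()` can be made to run arbitrary code, whereas a version that whitelists AST nodes rejects crafted payloads instead of executing them.

```python
import ast
import operator

# Buggy version: eval() on untrusted input is a security-critical bug.
# Crafted input such as "__import__('os').remove('...')" runs attacker code.
def naive_calc(expr):
    return eval(expr)  # do NOT expose this to untrusted parties

# Hardened version: walk the parsed AST and allow only numeric literals
# and basic arithmetic, so crafted payloads raise an error instead.
_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_calc(expr):
    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](ev(node.left), ev(node.right))
        raise ValueError("rejected untrusted input: %r" % expr)
    return ev(ast.parse(expr, mode="eval"))
```

The point is that the safe variant enumerates what it accepts rather than trying to blacklist attacks; everything outside the whitelist fails closed.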
That may be so now, but that doesn’t mean it’s impossible to change it. That the current default state for software is “likely insecure” reflects the fact that the market price for software security is lower than the cost of providing it.
Laws against software attacks raise the cost of performing such attacks, and therefore lower the incentives for people to ensure the software they use is secure. I think it would be worth a try to take that illegality away, and see if the market responds by coming up with ways to make software secure.
You can’t get really good physical security without expending huge amounts of resources: physical security doesn’t scale well. Software security is different in principle: if you get it right, it doesn’t matter how many resources an attacker can bring to bear on subverting your system over a data channel; they won’t succeed.
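The "get it right and attacker resources don't matter" claim has at least one rigorous instance: a one-time pad is information-theoretically secure, so no amount of computation recovers the plaintext without the key. A minimal sketch (function names are mine; a real system would use an authenticated cipher such as AES-GCM, since a pad must be truly random, as long as the message, and never reused):

```python
import secrets

def otp_encrypt(message: bytes):
    # The pad must be truly random, exactly as long as the message,
    # and used only once; reuse destroys the security guarantee.
    key = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same pad undoes the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
assert otp_decrypt(key, ct) == b"meet at noon"
```

Without the key, every plaintext of the same length is equally consistent with the ciphertext, which is exactly the sense in which attacker resources are irrelevant.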