If someone tried to implement this in real life, I would expect it to get implemented exactly halfway. I would expect to find out that my life became perfectly transparent for anyone who cares, but there would be some nice-sounding reason why the people at the top of the food chain would retain their privacy. (National security. Or there are a few private islands in the ocean where the surveillance is allegedly economically/technically impossible to install, and by sheer coincidence, the truly important people live there.) I would also expect this asymmetry to be abused against people who try to organize to remove it.
You know, just like those cops wearing body cams that mysteriously stop functioning exactly at the moment the recording could be used against them. That, but on a planetary scale.
From the opposite perspective, many people would immediately think about counter-measures. Secret languages; so that you can listen to me talking to my friends, but still have no idea what the topic was. This wouldn’t scale well, but some powerful and well-organized groups would use it.
People would learn to be more indirect in their speech, to allow everyone to pretend that anything was a coincidence or misunderstanding. There would be a lot of guessing, and people on the autism spectrum would be at a serious disadvantage.
How would the observed data be evaluated? People are hypocrites; just because you are doing the same thing many other people are doing, and everyone can see it, that doesn’t necessarily prevent the outcome where you get punished and those other people are not. People are really good at being dumb when you give them evidence they don’t want to see. Not understanding things you can clearly see would become an even more important social skill. There would still be taboos, and you would not be able to talk about them; not even in private, because privacy wouldn’t exist anymore.
But for the people who believe this would be great… I would recommend trying the experiment on a smaller scale: create a community of volunteers who install surveillance throughout their commune, accessible to all members of the commune. What would happen next?
The whole point of the book is that the failure mode you envision is going to happen by default. It is not a risk created by inverse surveillance, because it is already happening without it.
There is a problem: surveillance increases continuously, not in one abrupt step. At some point we must establish the norm that police turning off their cameras is a crime. The public had no trouble condemning Nixon for his 18-minute gap, but at the moment many police camera systems require positive steps of activation and download, which leaves plausible deniability of having simply forgotten.
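One way to make the “just forgot” excuse legible is to have the recording system itself flag unrecorded spans, rather than relying on anyone to volunteer them. A minimal sketch of that idea, assuming footage is already indexed as (start, end) timestamps; the segment format, shift window, and one-minute threshold here are hypothetical, not anything an existing body-cam system provides:

```python
from datetime import datetime, timedelta

# Hypothetical footage index: (start, end) timestamps of recorded segments in one shift.
SHIFT = (datetime(2024, 5, 1, 8, 0), datetime(2024, 5, 1, 16, 0))
segments = [
    (datetime(2024, 5, 1, 8, 0), datetime(2024, 5, 1, 11, 30)),
    (datetime(2024, 5, 1, 11, 48), datetime(2024, 5, 1, 16, 0)),
]

def find_gaps(shift, segments, tolerance=timedelta(minutes=1)):
    """Return unrecorded spans longer than `tolerance` within the shift window."""
    gaps = []
    cursor = shift[0]
    for start, end in sorted(segments):
        if start - cursor > tolerance:
            gaps.append((cursor, start))
        cursor = max(cursor, end)
    if shift[1] - cursor > tolerance:
        gaps.append((cursor, shift[1]))
    return gaps

for gap_start, gap_end in find_gaps(SHIFT, segments):
    print(f"Unrecorded span: {gap_start:%H:%M} to {gap_end:%H:%M}")
```

Anything that shows up in a report like this then requires an affirmative explanation, instead of being something an investigator has to notice and ask about.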
Strong upvoted and would add that we currently live in a world where surveillance is much more common than inverse surveillance, so proponents of a transparent society should, AFAICT, be much more focused on increasing inverse surveillance than surveillance at the moment.
I would expect it to get implemented exactly halfway
Not stopping halfway is a crucial part of the proposal. If they stop halfway, that is not the thing I have proposed. If an attempt somehow starts in earnest then fails partway through, policy should be that the whole thing should be rolled back and undone completely.
Regarding the difficulty of sincerely justifying opening up national security… that’s going to depend on the outcome of the wargames. I can definitely imagine an outcome that gets us the claim “Not having secret services is just infeasible”, in which case I’m not sure what I’d do. Might end up dropping the idea entirely. It would be painful.
allegedly economically/technically impossible to install
Not plausible if said people are rich and the hardware is cheap enough for the scheme to be implementable at all. There isn’t an excuse like that. Maybe they could say something about being an “offline community” without much of a network connection… but the data could just be stored in a local buffer somewhere. They’d be able to arrange a temporary disconnection and get away with some things, one time, I suppose, but they’d have to be quick about it.
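The “offline community” excuse is weak precisely because store-and-forward buffering is trivial: footage accumulates locally and ships whenever connectivity returns. A minimal sketch of that pattern; the directory paths and the `upload` transport are hypothetical placeholders, not part of the original proposal:

```python
import os
import shutil

BUFFER_DIR = "/var/spool/camera_buffer"   # hypothetical local buffer location
UPLOADED_DIR = "/var/spool/camera_sent"   # hypothetical archive of shipped files

def buffer_recording(path):
    """While offline, recordings simply pile up in the local buffer."""
    shutil.copy(path, BUFFER_DIR)

def flush_buffer(upload):
    """When connectivity returns, ship everything that accumulated, oldest first."""
    pending = sorted(os.listdir(BUFFER_DIR))
    for name in pending:
        src = os.path.join(BUFFER_DIR, name)
        upload(src)  # `upload` is whatever transport the community actually has
        shutil.move(src, os.path.join(UPLOADED_DIR, name))
    return len(pending)
```

The only thing a disconnection buys is delay; the record still arrives eventually unless someone takes the further, visibly criminal step of destroying the buffer.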
From the opposite perspective, many people would immediately think about counter-measures. Secret languages
Obvious secret languages would be illegal. It’s exactly the same crime as brazenly covering the cameras or walking out of their sight (without your personal drones). I am very curious about the possibilities of undetectable secrecy, but there are reasons to think it would be limited.
I would recommend trying the experiment on a smaller scale: create a community of volunteers who install surveillance throughout their commune, accessible to all members of the commune. What would happen next?
(Hmm… I can think of someone in particular who really would have liked to live in that sort of situation; she would have felt a lot safer… ]:)
One of my intimates has made an attempt at this. It was inconclusive. We’d do it again.
But it wouldn’t be totally informative. We probably couldn’t justify making the data public, so we wouldn’t have to deal much with the omniscient-antagonists thing, and the really difficult questions wouldn’t end up getting answered.
One relevant small-scale experiment would be Ray Dalio’s hedge fund Bridgewater; I believe they practice a form of (internal) radical openness, cameras and all. His book is on my reading list.
I would one day like to create an alternative to secure multiparty computation schemes like Ethereum by just running a devoutly radically transparent (a panopticon accessible to external parties) webhosting service on open hardware. It would seem a lot simpler: auditing, culture, and surveillance as an alternative to these very heavy, quite constraining crypto technologies. The integrity of the computations wouldn’t be mathematically provable, but it would be about as indisputable as the moon landing.
It’s conceivable that this would always be strictly more useful than any blockchain world-computer; as far as I’m aware, we need a different secure multiparty computation technique every time we want to compute on hidden information. For a radically transparent webhost, the incredible feat of arbitrary computation on hidden data at near-commodity-hardware efficiency (fully open, secure hardware is unlikely to be as fast as whatever Intel’s putting out, but it would be in the same order of magnitude) would require only a little bit of additional auditing.
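One ingredient of the “indisputable as the moon landing” claim could be mechanized cheaply: publish a tamper-evident, append-only log of what the host executed, which outside auditors can replay against the camera and hardware record. The sketch below is a toy hash-chained log under my own assumptions; it is not the author’s design, and a real service would also need signing, replication, and external witnesses:

```python
import hashlib
import json

class TransparencyLog:
    """Append-only, hash-chained log of jobs the host ran.

    Each entry commits to the previous one, so anyone replaying the
    published log can detect a retroactive edit. Toy sketch only.
    """

    def __init__(self):
        self.entries = []
        self.head = "0" * 64  # genesis hash

    def append(self, record):
        entry = {"prev": self.head, "record": record}
        serialized = json.dumps(entry, sort_keys=True).encode()
        self.head = hashlib.sha256(serialized).hexdigest()
        self.entries.append(entry)
        return self.head

    def verify(self):
        head = "0" * 64
        for entry in self.entries:
            if entry["prev"] != head:
                return False
            head = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()
            ).hexdigest()
        return head == self.head

# Hypothetical usage: the host logs each job's input/output digests as it runs.
log = TransparencyLog()
log.append({"job": "render_page", "input_hash": "abc123", "output_hash": "def456"})
log.append({"job": "run_query", "input_hash": "789aaa", "output_hash": "bbb000"})
print(log.verify())  # True; tampering with any earlier entry breaks the chain
```

Unlike a blockchain, nothing here proves the computation was correct; it only makes the host’s claims about what it did consistent and auditable, with the surveillance and culture carrying the rest of the trust.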