Here and there, we see glimmers of some folks out there starting to get it. What this era is about. What it needs.
For example, Josh Klein, on Slate, offers a thoughtful rumination, Privacy isn’t a Right; It’s a Commodity, on how big companies access and use our metadata… and on how this is only truly unfair if it remains a one-way street. He suggests that “privacy” isn’t really the issue; the issue is how to enforce our rights and interests in benefiting from our own data.
Far more often, you find cases in which fine insights and great erudition culminate in… the dreariest of unimaginative conclusions, alas! Still, you take what you can get these days, so let’s have a glance at one recent article that starts and continues brilliantly – laying down insights about the dilemmas of our age – before falling apart at the end, just where the reader had hoped for cogent suggestions.
In the transcript of a speech, “Tradeoffs in Cyber Security,” Dan Geer – a computer security analyst and risk management specialist – offers up a paragraph redolent with insight and meaning, even extracted from his overall context:
“The essential character of a free society is this: That which is not forbidden is permitted. The essential character of an unfree society is the inverse, that which is not permitted is forbidden. The U.S. began as a free society without question; the weight of regulation, whether open or implicit, can only push it toward being unfree. Under the pressure to defend against offenders with a permanent structural advantage, defenders who opt for forbidding anything that is not expressly permitted are encouraging a computing environment that does not embody the freedom with which we are heretofore familiar.”
(Sharp readers may note this echoes a particular scene in EXISTENCE.)
Geer goes on to show the fundamental problem faced by anyone aiming to exert control, even control that aims for the safety and protection of the public: “Moore’s Law continues to give us two orders of magnitude in compute power per dollar per decade while storage grows at three orders of magnitude and bandwidth at four. These are top-down economic drivers. As such, the future is increasingly dense with stored data but, paradoxically, despite the massive growth of data volume, that data becomes more mobile with time.”
It is a very rich speech – idea-wise. Here’s another pungent paragraph:
“We are ever more a service economy, but every time an existing service disappears into the cloud, our vulnerability to its absence increases. Every time we ask the government to provide goodnesses that can only be done with more data, we are asking government to collect more data.
“Let me ask a yesterday question: How do you feel about traffic jam detection based on the handoff rate between cell towers of those cell phones in use in cars on the road? Let me ask a today question: How do you feel about auto insurance that is priced from a daily readout of your automobile’s black box? Let me ask a tomorrow question: In what calendar year will compulsory auto insurance be more expensive for the driver who insists on driving their car themselves rather than letting a robot do it? How do you feel about public health surveillance done by requiring Google and Bing to report on searches for cold remedies and the like? How do you feel about a Smart Grid that reduces your power costs and greens the atmosphere but reports minute-by-minute what is on and what is off in your home? Have you or would you install that toilet that does a urinalysis with every use?”
These snippets merely sample an extremely thought-provoking speech that merits close reading. Another example: “It is not heartless to say that if every human life is actually priceless, then it follows that there will never be enough money. One is not anti-government to say that doing a good job at preventing terrorism is better than doing a perfect job.”
Where Geer fails is toward the end. Having assembled many parts and perspectives of a daunting future, he disappoints with suggestions that amount to shrugs of “whatcha gonna do?”
Above all, Geer fails to seek out the intrinsic ways in which these zero-sum or negative-sum problems can be turned positive sum, by turning away from the paternalistic protection model, and back to one that worked for our predecessors, stretching back 300 years, who also had to deal with their own crises of expanding information. They resolved the problem by relying primarily on the robust resilience of distributed systems, especially those consisting of a knowing and empowered citizenry. In other words, lateral stability and resilience, versus vertical fragility.
That is intrinsically the basis for our enlightenment and every aspect of our social contract, and yet it is the last approach that most people — even smart ones — ever turn to. Least of all smug “heroes” like Julian Assange, who claim to have the People’s interests at heart. Fundamentally, the message preached by Hollywood has taken root: do not expect anything from your fellow citizens, the only ones you could possibly rely on over the long run.
Geer does refer glancingly to this possibility of a positive sum outcome from synergies of reciprocal and isotropic transparency… alas, only to dismiss it from mind. He starts by citing a sage who I must guess was my predecessor in this topic…
“Howard Brin was the first to suggest that if you lose control over what data is collected on you, the only freedom-preserving alternative is that if everyone else does, too. If the government or the corporation can surveil you without asking, then the balance of power is preserved when you can surveil them without asking. Bruce Schneier countered that preserving the balance of power doesn’t mean much if the effect of new information is non-linear, that is to say if new information is the exponent in an equation, not one more factor in a linear sum.”
== What does it all mean… Howard? ==
Um… either there’s a wiseguy out there with the same last name as me, who has said some smart things… or else this is the first time that I have ever been called “Howard!” Either way, it was honest of Geer to give this two-sentence nod to the alternative approach, the only alternative to his insightful, yet suggestion-free pessimism.
Alas, he goes on to cite Bruce Schneier’s shallow and refutable dismissal of sousveillance — the “exponent” incantation — while ignoring the obvious answer…
… that individual citizens can cluster. That they can join non-governmental organizations, like the ACLU and the Electronic Frontier Foundation, pooling their dues and enabling such groups to hire top-quality lawyers and top technical people. Moreover, such NGOs can also coalesce efforts and expertise from even wider arrays of volunteers, activists and tech-empowered smart mobs. (As I and some other authors portray happening ever more in our future.) Indeed, such clusters can often rally support from foundations, companies… and even portions of government that are institutionally separated from the portions undergoing scrutiny.
One great way to enhance this effect might be to enact more substantial whistle-blower protection laws, plus philanthropist-funded “henchman’s prizes” that lure the revelation of heinous schemes. Worth noting: such methods could put an upper limit on the crucial product that gives conspirators their power — secretiveness times nastiness times monetary resources times the number of underlings doing their bidding. If that product is kept small enough, by suppressing some of its factors, then those NGOs will have a real chance, and Schneier’s entirely made-up “exponent” effect will be shown to be the chimera that it always was.
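The multiplicative point above can be made concrete with a toy sketch. This is my own illustration, not any formalism from the original argument: the function name, factor scales and numbers are all invented for the example. The key property of a product, unlike a sum, is that shrinking any single factor shrinks the whole by the same ratio.

```python
def conspiracy_power(secretiveness, nastiness, money, underlings):
    """Toy product model: each factor multiplies the others, so driving
    ANY one factor toward zero caps the entire product."""
    return secretiveness * nastiness * money * underlings

# A scheme scoring high on every factor (arbitrary 1-10 scales for the
# first two; money and headcount are likewise made-up units):
baseline = conspiracy_power(9, 8, 100, 50)      # 360000

# Whistle-blower laws and leak prizes attack only secretiveness,
# yet the entire product shrinks by the same 9x ratio:
after_leaks = conspiracy_power(1, 8, 100, 50)   # 40000

assert after_leaks * 9 == baseline
```

Under this reading, sousveillance need not outpace every resource of the powerful; suppressing the secretiveness factor alone collapses the product.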
Indeed, my approach hearkens to the fundamental trick of the Smithian branch of the enlightenment: break up concentrations of power and sic powerful elites against each other. If civil servants and corporations and the varied branches of the wealthy, and NGOs and the press and academics and so on, can be kept from colluding — and incentivized to compete warily — then the powerful will leap upon each other’s malfeasances FOR us. This is not naivete; it is precisely the formula of three centuries. Moreover, snarkers who disdain this as utopian are not only unhelpful; they prove that they know nothing of the roots of their own civilization: the factors that enabled them to sit where they now reside, mostly free, mostly knowing, and empowered to grouse and complain.
We can argue forever over details, e.g. whether agile, analytical and deliberative tools will actually produce smart mobs as capable as those I portray in EXISTENCE. But the core point is this… not one of the grouches out there, whether as brilliant as Geer or as sadly reflexive as Schneier, has ever once presented us with a suggested recourse anywhere near as potentially effective as sousveillance and universal transparency.
Grudgingly, half-heartedly, they wind up proposing that we use the cleansing, invigorating tonic of light. Amid much grinding of teeth, they suggest revelatory moves of reciprocal accountability that more and more resemble…
== But then… signs of hope! ==
Oh, but one sees glimmers all over! After years of misquoting my works and attributing to me positions diametrically opposite to those I clearly stated in The Transparent Society (thus proving that he never even cracked open a copy of the book), it seems that at last “security expert” Bruce Schneier is starting to get the need for an open and accountable world. He still believes shrouds and secrecy can work for the common man, a charming naiveté. But in another recent piece, he now appears to accept that we must aggressively look back at power.
In The Battle for Power on the Internet, Schneier discusses how cloud computing and tighter vendor control over operating systems are forcing users into constraints that were much looser in the old PC days. “I have previously characterized this model of computing as ‘feudal.’ Users pledge their allegiance to more powerful companies who, in turn, promise to protect them from both sysadmin duties and security threats. It’s a metaphor that’s rich in history and in fiction, and a model that’s increasingly permeating computing today.”
And: “It’s not all bad, of course. We, especially those of us who are not technical, like the convenience, redundancy, portability, automation, and shareability of vendor-managed devices. We like cloud backup. We like automatic updates. We like not having to deal with security ourselves. We like that Facebook just works — from any device, anywhere.”
— Solid stuff… that I have been saying for years. Schneier goes on to describe how technological advances are first exploited by the nimble — say, “Robin Hoods” — but eventually become power-multipliers for already-ponderous but mighty entities like nation-states and corporations.
Bruce then rises to exceptional cogency: “Transparency and oversight give us the confidence to trust institutional powers to fight the bad side of distributed power, while still allowing the good side to flourish. For if we’re going to entrust our security to institutional powers, we need to know they will act in our interests and not abuse that power. Otherwise, democracy fails.”
Will wonders never cease? Welcome back toward the light.
== And finally ==
Margaret Atwood provides a thorough and nuanced review of “The Circle” by Dave Eggers – a dystopian/utopian novel of the near future, in which a super version of Facebook gathers all lives – mostly willingly – into a version of a Transparent Society. Mind you, I don’t think things would work this way. Humans would insist on an equilibrium with more enforceable zones of privacy than the toilet and bedroom. Eggers is not describing humans.
Above all, and key to my argument, is that citizens empowered by transparency would be ABLE to push for such consensus reserves — realms to be left alone. Still, exaggeration is a common and effective literary technique. (In avoiding it, I may have hurt my own commercial success!) I hope some of you will report back here with what you think of this book. It is, at minimum, a rumination that offers much for discussion.