Wendy M Grossman looks at the issue of privacy policies, and suggests that the system must be 'fixed' in order for users to completely understand what information they are signing away.
The modern usability movement as it applies to computer software and hardware design began in 1988, when Donald Norman published The Design of Everyday Things. Norman, as he has patiently recounted many times since, was inspired to write that book by six frustrating months in England, where he was constantly maddened because nothing, not even the light switches, worked logically. His most recent book, Living with Complexity, looked at the design of complex systems, trying to pinpoint how to make the services we navigate every day less frustrating.
I was thinking of this recently, when the Open Rights Group hosted a meeting on the mid-May Sunday Times story that mobile network operator and ISP EE was sharing detailed customer data with the market survey company Ipsos Mori. EE and Ipsos Mori sent representatives, as did the Information Commissioner's Office. Essentially, they said a small pilot project had been misunderstood.
Privacy is a complicated issue because even experts do not have good answers to questions like how big a risk over what period of time is posed by the disclosure of a particular set of data. We know this much: today's "anonymized" data is tomorrow's reidentified data as more and more datasets come online to help triangulate it, much the way today's strong cryptography will be weaker tomorrow as computational power continues to grow. The ability to make accurate assessments is complicated by unknown externalities. How many users remember what they posted under which terms and conditions five years ago? And users themselves have varying understanding of what they think is happening.
We were deep into privacy policies and user consent when I began to imagine what these might look like under a more stringent data protection law. Would they be like today's omnipresent cookie authorization requests? Click OK to post this data. Click OK to share this data with our partner who just wants to sell you stuff. Click OK to let us reuse this data to personalize the video on the billboard you're about to pass. Click OK to…you mean, you didn't want to send your personal data to the US National Security Agency?
I am not suggesting we fix the users. The users aren't broken. Fix the *systems*.
The problem, someone pointed out to me afterwards, is that a lot of people think their government knows everything about everyone anyway. But there's a big difference between that casual cynicism and seeing proof. Right on cue came the next day's newspaper headlines: the Guardian and the Washington Post reported that under a previously unknown program called PRISM the NSA has direct access to the systems of US-based companies: Facebook, Google, Apple, AOL, Skype, PalTalk, and YouTube. (A number of these companies are quoted denying they have given such access.) Direct access as in, walk right in and pick the data they want. Also: the NSA is collecting the phone records of millions of customers of Verizon, one of the biggest US telcos. And: the UK's GCHQ has had access since 2010.
Worse, US government politicians are defending it: Democratic senators Harry Reid (Nevada) and Dianne Feinstein (California) in the Wall Street Journal, and President Obama in the Guardian. Charles Arthur has a helpful and rational decoding of all this, and Nick Hopkins explains the UK's legal situation with respect to phone records.
At Computers, Privacy, and Data Protection earlier this year, the long-time privacy activist Caspar Bowden discussed the legal and technical framework for surveillance-as-a-service and the risks for EU users of cloud computing (which includes social media sites). Essentially, if there is a back door installed in these systems, "interception" is no longer a useful concept, and encryption is no longer a useful defense. Inside those data centers, data is perforce decrypted, and legally authorized direct access to stored uploaded data under the FISA Amendments Act (since the Fourth Amendment does not protect non-US persons) is not interception of communications.
Before the Internet, it was pretty simple to avoid being surveilled by a foreign country: you just didn't go there. So the first thing we need to make explicit in users' mental models is that uploading photographs and personal data to sites like Google and Facebook is digitally entering the US. Maybe we could start by requiring those services to display large pictures of their national flag.
Wendy M. Grossman’s Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted irregularly to the net.wars Pinboard - or follow on Twitter.
ORGZine: the Digital Rights magazine written for and by Open Rights Group supporters and engaged experts expressing their personal views