Fogged
Amidst debates over privacy versus security, what can we learn from the membership model of the nineteenth-century gentlemen's club?
Image: CC-AT-SA Flickr: pit-yacker
The Reform Club, I read on its website, was founded as a counterweight to the Carlton Club, where Conservatives liked to meet and plot away from public scrutiny. To most of us, it's the club where Phileas Fogg made and won his bet that he could travel around the world in 80 days – no small feat in 1872.
On Wednesday, the Club played host to a load of people who don't usually talk to each other much because they come at issues of privacy from such different angles. Cityforum, the event's organizer, pulled together representatives from many parts of civil society and government security, along with corporate and government researchers.
The key question: what trade-offs are people willing to make between security and privacy? Or between security and civil liberties? Or is "trade-off" even the right paradigm? It was good to hear multiple people saying that the "zero-sum" attitude is losing ground to "proportionate". That is, the debate is moving on from viewing privacy and civil liberties as things we must trade away if we want to be secure, towards weighing the size of the threat against the size of the intrusion. The disproportion is clear to all when, for example, local councils use the anti-terrorism provisions of the Regulation of Investigatory Powers Act to check whether householders are putting out their garbage for collection on the wrong day.
It was when the topic of the social value of privacy was raised that it occurred to me that probably the closest model to what people really want lay in the magnificent building all around us. The gentlemen's club offered a social network restricted to "the right kind of people" – that is, people enough like you that they would welcome your fellow membership and treat you as you would wish to be treated. Within the confines of the club, a member like Fogg—who spent all day every day there—would have had, I imagine, little privacy from the other members or club staff, whose job it was to know what his favourite drink was and where and when he liked it served. But the club afforded members considerable protection from the outside world. Pause to imagine what Facebook would be like if the interface required each would-be addition to your friends list to be proposed and seconded and incomers could be black-balled by the people already on your list.
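For the sake of the thought experiment, here is a toy sketch in Python of such an admissions rule. The class and method names are invented for illustration; nothing like this exists in any real social network's interface.

```python
class ClubbableFriendsList:
    """Toy model of a friends list run like a club membership book:
    candidates must be proposed and seconded by existing members,
    and any existing member may black-ball them."""

    def __init__(self, founding_members):
        self.members = set(founding_members)

    def admit(self, candidate, proposer, seconder, blackballs=()):
        if proposer not in self.members or seconder not in self.members:
            raise ValueError("proposer and seconder must already be members")
        if proposer == seconder:
            raise ValueError("a candidate needs two distinct supporters")
        # A single black ball from any existing member excludes the candidate.
        if any(b in self.members for b in blackballs):
            return False
        self.members.add(candidate)
        return True

club = ClubbableFriendsList({"fogg", "stuart"})
club.admit("passepartout", proposer="fogg", seconder="stuart")  # True: admitted
club.admit("fix", proposer="fogg", seconder="stuart",
           blackballs=["passepartout"])  # False: one black ball is enough
```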
This sort of web of trust is the structure the cryptography software PGP relies on for authentication: when you generated your public key, you were supposed to have it signed by as many people as you could. Anyone wanting to verify the key could then scan the list of signatures for someone they themselves knew and trusted. The big question with such a structure is how to make managing it scale to a large population. Things are a lot easier when it's just a small, relatively homogeneous group you have to deal with. And, I suppose, when you have staff to support the entire enterprise.
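As a rough illustration of that lookup, here is a minimal sketch (illustrative Python only; real PGP keys, signatures, and trust calculations are far more elaborate): each key carries the set of keys that have signed it, and a key is accepted if a chain of signatures leads back to someone you already trust.

```python
# Toy web of trust: each key ID maps to the set of key IDs that signed it.
signatures = {
    "carol": {"alice", "bob"},
    "dave": {"carol"},
}

def vouched_for(key_id, trusted, signatures, depth=2):
    """Accept a key if a chain of at most `depth` signatures
    leads back to a key we already trust."""
    if depth == 0:
        return False
    for signer in signatures.get(key_id, set()):
        if signer in trusted or vouched_for(signer, trusted, signatures, depth - 1):
            return True
    return False

my_trusted = {"alice"}
print(vouched_for("carol", my_trusted, signatures))    # True: alice signed carol
print(vouched_for("dave", my_trusted, signatures))     # True: via carol, one hop out
print(vouched_for("mallory", my_trusted, signatures))  # False: no chain to anyone we trust
```

Even in this toy version the scaling problem is visible: the longer the chains you are willing to accept, the less each individual signature is worth.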
We talk a lot about the risks of posting too much information to things like Facebook, but that may not be its biggest issue. Just as traffic data can be more revealing than the content of messages, complex social linkages make it impossible to anonymise databases: who your friends are may be more revealing than your interactions with them. As governments and corporations talk more and more about making "anonymised" data available for research use, this will be an increasingly large issue. An example is a little-known incident in 2005, when the database of a month's worth of UK telephone calls was exported to the US with individuals' phone numbers hashed to "anonymise" them – thin protection, given that the set of valid phone numbers is small enough that anyone can hash them all and match the results. An interesting technological fix comes from Microsoft with the notion of differential privacy – a system for protecting databases both against current re-identification and attacks with external data in the future. The catch, if it is one, is that you must assign your database a sort of query budget in advance – and when it's used up, you must burn the database because it can no longer be protected.
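The budget mechanism is easier to see in miniature. The sketch below is a toy in Python, not Microsoft's actual system: each counting query gets Laplace noise calibrated to a per-query epsilon, each answer spends that much of a fixed budget, and once the budget is gone the database refuses to answer.

```python
import random

class PrivateCounter:
    """Toy differentially private query interface for a single database.
    Each counting query spends `epsilon_per_query` of a fixed privacy
    budget; when the budget runs out, the database stops answering
    (the point at which, in the column's terms, you burn it)."""

    def __init__(self, records, total_budget=1.0, epsilon_per_query=0.1):
        self.records = records
        self.budget = total_budget
        self.epsilon = epsilon_per_query

    def noisy_count(self, predicate):
        if self.budget < self.epsilon:
            raise RuntimeError("privacy budget exhausted: burn the database")
        self.budget -= self.epsilon
        true_count = sum(1 for r in self.records if predicate(r))
        # Counting queries have sensitivity 1, so Laplace noise of scale
        # 1/epsilon gives epsilon-differential privacy per query. The
        # difference of two exponential samples is a Laplace sample.
        noise = random.expovariate(self.epsilon) - random.expovariate(self.epsilon)
        return true_count + noise

db = PrivateCounter([{"area": "020"}] * 600 + [{"area": "0161"}] * 400)
print(db.noisy_count(lambda r: r["area"] == "020"))  # roughly 600, plus noise
# After about ten such queries the budget of 1.0 is spent and the next
# call raises: the database can no longer be safely queried.
```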
Public opinion polls are a crude tool for measuring what privacy intrusions people will actually put up with in their daily lives. A study by RAND Europe released late last year attempted to examine such things by framing them in economic terms. The good news is that they found you'd have to pay people £19 to get them to agree to provide a DNA sample for inclusion in their passports. The weird news is that people would pay £7 to include their fingerprints. You have to ask: what pitch could RAND possibly have made that would make this seem worth even one penny to anyone? By contrast, we also know the price people are willing to pay for club membership.
Hm. Fingerprints in my passport or a walk across a beautiful, mosaic floor to a fine meal in a room with Corinthian columns, 25-foot walls of books, and a staff member who politely fails to notice that I have not quite conformed to the dress code? I know which is worth paying for if you can afford it.
Wendy M. Grossman's website has an extensive archive of her books, articles, and music, as well as all the earlier columns in this series.