Today is a day of action and protest against mass surveillance by our governments: The Day We Fight Back.
Aptly, the first thing in my Twitter stream when I opened my laptop this morning was the inventor of the Web re-tweeting the Electronic Frontier Foundation’s message…
Things move quickly, our lives are busy. Today, though, take the opportunity of The Day We Fight Back to stop and ask – what kind of a world do you want to live in? What future do we want to choose? What kind of Web do we want to live with around us?
Then do one thing – or a few things – to mark your intent, to take small steps toward the future you want.
I'm learning more about how to take control of my data and security personally – by attending a Cryptoparty this evening, where people will be learning how to lock down their devices and manage their online data. I'm hoping to help hold more of these for friends, colleagues and clients in the future.
Spread the word. Tell others what today means and what you are doing to support it…
Being utterly besotted with the web, and especially the social web, as I am, I tend to dislike nay-saying about its significance and the manifold benefits this thing will bring to society, the world, etc. You know the sort of Daily Fail nonsense: Facebook gives you cancer, Twitter rots your brain, bloggers never meet real people.
But there’s a difference between reactionary nonsense and thoughtful critiques. Over at the O’Reilly Radar blog, Joshua-Michéle Ross has been poking at some of the more troublesome prospects that social technologies bring. Like how much of our identity and personal data we are surrendering for analysis by corporations and governments (since analysis of that data is a big part of my business, but I also value personal freedom, that’s a particularly interesting issue for me).
He takes us through a series of four posts that I highly recommend reading:
What makes this post extremely fascinating is that it comes from the O’Reilly Radar, which – in my experience anyway – has tended to be on the “cup overfloweth” side of the New New Social Thing, never mind a Glass Half Full – so this Glass Half Empty article – the first, it seems, of a series – is a rather fascinating shift of tenor, methinks.
He senses the beginning of a backlash, good and proper, perhaps coming from businesses (that aren’t managing to figure out how to get value out of networks as fast as Joshua-Michéle fears) as well as individuals wanting to rein in how much web shadow they are comfortable casting.
Meanwhile, Ian Delaney has a melancholy reflection on this subject that makes for good further reading and thinking matter, about how his early hopes that social media would bring socialist values to the fore are fading. He picks up the Panopticon analogy and extends it to society.
The philosopher Michel Foucault picked up and ran with the idea of the Panopticon back in the 70s, especially in his best-known work Discipline and Punish. His idea was that Bentham’s model wasn’t just an idea for a prison, but for a society.
He argued that prisons are a really new idea. Back in the past, we simply thrashed/burned/drowned/stabbed transgressors. That all changed in the C18th with the Enlightenment. The idea of law-enforcement was ‘enlightened’ with the understanding that resources [people] didn’t need to be wasted and that better social control is exercised through freely-given compliance, rather than co-option.
People could be turned into machines, a consequence of political thinking in the emergence of industrial society and the rush to efficiency and cost-allocation. Once properly mechanised, they could be ‘trusted’ – the scare quotes, because the trusted prisoner is no longer human. A big part of that process is surveillance: once people know that they are always (potentially) watched, they’re a bit more compliant to the rules, and a bit more like machines.
Actually, Ian turns from melancholy to fighting talk. Where is the transgression, he asks? What passes for subversion online is often just pranksterism, often funded too, in small feats of legerdemain to slip a flash of brand in front of the viewer.
The echo chamber is another danger in all of this, Ian says. Where are the racists in his network?
Racists are poised to take Stoke in the next by-election. They don’t appear on my spectrum because I have deliberately blinded myself to their existence on a day-to-day basis. Diversity of opinion is purely opt-in (with strong incentives to opt-out) in socialmediaworld.
Add some racists to your feed list? I don’t know about racists, but I enjoy having different views on hand in my inbox. I detest a great deal of what some political bloggers say, but I like to try and understand. Sometimes I have had my mind changed too. I understand people on the right (OK, mainly the centre right) much better than I did when I was a pre-web student. Then I used to sneer at people for reading the Telegraph, for goodness sake. Now I’ll read its leaders and blog posts alongside Comment is Free and the Guardian.
I’ll unsubscribe because people are boring, not because I disagree. Maybe that’s just me. And maybe I need to listen more to some Green voices, some far right voices, some Socialist Workers Party voices.
All is not lost, I say. Fight on… This world is still ours to shape, perhaps as never before. We’re right to identify these pitfalls and blind alleys, but nothing is inevitable in all of this. There’s still a revolution to be had.
After we’ve read these warnings, go and read some Umair Haque manifesto. Then think about what you will do this year to change the world. Seriously.
For the time being, Google acts as a supreme court in a world of “sovereign users” clashing with ever-increasing frequency with nation states that would prefer to have the last word on free speech.
Google acts like a benign dictator of the world’s data, which makes it important that we keep an eye on how it behaves and who is in charge of the decisions about what can and can’t be accessed via the company’s search engine and YouTube services.
An article in the New York Times (free registration may be needed – I can never work out the NYT’s crazy system) takes a close look at the Google legal team and some of the legal struggles they have been involved in around the world. These cases, and how The Goog handles itself, give us a sense of how it is operating within the various codes of behaviour, mainly informal, that have emerged.
Voluntary self-regulation means that, for the foreseeable future, Wong and her colleagues will continue to exercise extraordinary power over global speech online. Which raises a perennial but increasingly urgent question: Can we trust a corporation to be good — even a corporation whose informal motto is “Don’t be evil”?
Governments of various repressive shades are testing Google all the time. We’re all aware of the restrictions in China, and of how the Thai and Turkish governments effectively hold to ransom the company’s access to their citizens (and vice versa) in return for Google blocking access to certain materials, most often YouTube videos. But other attempts to clamp down on content and conversations are surprisingly common:
Over the past couple of years, Google and its various applications have been blocked, to different degrees, by 24 countries. Blogger is blocked in Pakistan, for example, and Orkut in Saudi Arabia. Meanwhile, governments are increasingly pressuring telecom companies like Comcast and Verizon to block controversial speech at the network level. Europe and the U.S. recently agreed to require Internet service providers to identify and block child pornography, and in Europe there are growing demands for network-wide blocking of terrorist-incitement videos. As a result, Wong and her colleagues said they worried that Google’s ability to make case-by-case decisions about what links and videos are accessible through Google’s sites may be slowly circumvented, as countries are requiring the companies that give us access to the Internet to build top-down censorship into the network pipes.
Google operates a “decider model” for what plays and doesn’t on YouTube, for instance. Basically, decisions get escalated depending on their complexity. A further concern for us all, Rosen points out, is that this system isn’t very scalable at a time in the development of the web when video – and indeed all forms of content – is, well, scaling pretty rapidly…
I trust Google – for now. But it’s important that we keep watching. Last word to Rosen and Lawrence Lessig:
“During the heyday of Microsoft, people feared that the owners of the operating systems could leverage their monopolies to protect their own products against competitors,” says the Internet scholar Lawrence Lessig of Stanford Law School. “That dynamic is tiny compared to what people fear about Google. They have enormous control over a platform of all the world’s data, and everything they do is designed to improve their control of the underlying data. If your whole game is to increase market share, it’s hard to do good, and to gather data in ways that don’t raise privacy concerns or that might help repressive governments to block controversial content.”