Yesterday at DLD began with Facebook’s VP of Communications and Public Policy, Elliot Schrage, defending the company against its critics. What is Facebook going to do about Facebook was his theme, while everyone else is asking what we are going to do about Facebook – and Amazon and Google and, maybe not so much, Apple.
There were two lines of attack in questions from the crowd: Facebook is to blame for fake news skewing elections, and Facebook had the media industry help build it up as a publishing channel and is now going to screw them.
The fake news question – Kara Swisher asked if underinvestment in prevention was a management issue – was answered along the lines of: hands up, sorry, but the government, the CIA and the FBI missed it too. A US questioner, a US respondent and a US company explain the US-centric framing, even if he was on stage in Munich. Similar questions hang over Facebook’s ads and algorithms and their influence on Brexit and the Catalan independence referendum. Still, when you’re being blamed for Trump getting elected in your own country, I can see why those other elections might seem peripheral.
Image: Kara Swisher in Sarah Connor mode at DLD18 – glasses doubtless to shield against the ferocious brightness of the enormous LED displays on stage.
On publishing, Schrage also acknowledged the criticism but stressed that the company was interested – in fact, Zuckerberg himself was especially interested – in doing right by users:
“Mark is committed… to helping promote strong communities and an informed public. Trusted publications will benefit.”
Later in the morning, documentary filmmaker and artist, Hito Steyerl offered a direct critique of Facebook’s new policy to weed out fake news sites by asking users which sources they trusted:
“As a trained documentary filmmaker, I can tell you a credible source is not defined by popularity.”
Andrew Keen, long-time critic of Silicon Valley and the tech sector’s power, followed Facebook on stage in conversation with Paul-Bernhard Kallen, CEO of Germany’s Hubert Burda Media.
Immediately invoking the concept of surveillance capitalism, Keen asked why the big US tech companies won’t accept their responsibilities as media platforms. Kallen attributed this to legal liability loopholes: legislation passed during the Clinton administration laid the benign landscape that made possible the growth of the GAFA (Google, Amazon, Facebook, Apple) – or the Stacks, as Bruce Sterling named them when he spotted their developing dominance about six years ago.
In Kallen’s analysis, the next generation of online services will be far more responsible in how they understand and manage their influence on society. Whether that generation of services comes from the incumbent tech giants or from new players, he couldn’t predict. Disruptive models like the blockchain and decentralised web forces could make these future companies very different from the ones we see today – take a look at platform co-ops and ownerless blockchain-powered corporations for hints of what might come to pass.
Image: Hito Steyerl
In the session that featured Hito Steyerl, whom I mentioned earlier in this post, there was some expansive, edge-of-the-field thinking from both her and Evgeny Morozov, well chaired by the Serpentine Gallery’s Hans Ulrich Obrist. The session suffered small migrations of executives leaving – bored, confused or just hungry (it was running into lunchtime by this point). It was one of the least crowded sessions in the main hall, which was a shame, as the ideas were deeply relevant, if very challenging.
Here are some of the threads from Morozov and Steyerl that I want to pick up for more research and thinking later on:
- The digital intermediation of everything: this is the title of an essay by Morozov looking at how the economics and power of big tech companies owning your data will play out.
- Techno-religiosity: Something that Yuval Noah Harari explored in Homo Deus and in a Google talk that Steyerl referred to. Steyerl says: “The more advanced the technology, the more likely people begin to discuss it in religious or spiritual terms” – think AI, especially.
- Orbis theory: The hypothesis that the mass colonisation of the Americas caused a global lowering of CO2 levels: some 50 million indigenous people died, forests reclaimed their farmland, and the regrown trees absorbed more carbon, with the resulting dip in atmospheric CO2 dated to around 1610.
- Owning your own data: The importance of this is huge when looked at in its economic and political context. I recall the vendor relationship management movement that began ten years ago, and similar efforts, but now GDPR (the General Data Protection Regulation) in Europe is creating a legal push for it. Morozov also alluded to the idea of states nationalising data to protect their citizens from corporations.
- Access to people’s data was about better advertising; now it is also about fuelling AI: The big tech companies are using the data they get from people in return for free services (search, social networks, convenient ecommerce) to develop their machine learning abilities.
- Benevolence can turn to indifference – and it would hurt us: Services from Google and Facebook feel like, and for now are, a social good – many people couldn’t otherwise afford what the tech giants give us for free. But if the companies no longer needed the data from users, that benevolence would likely turn to indifference. Effectively Morozov is asking: what if advertising weren’t the main revenue stream for these companies and they got their money from B2B services instead? What would happen to the services they provide?
- The coming machine learning monopoly: Morozov said that if he were in business he would be worried about the growing power of the four giants as the sole providers of machine learning. Once they are the best, they will dictate the terms of access to everyone else.
At the end of the first day of DLD 18 there was blood in the water, and no one was sure whether it was theirs or Facebook’s and Google’s… or both.