Introduction by Geert Lovink
What is the current state of Google research? The first phase of Google research mainly dealt with the mystery of the algorithm and Google's founders Larry Page and Sergey Brin. Google has since grown into a multi-faceted corporation, which requires a move beyond this first phase. The second phase could be considered a European critical perspective:
2007: The Jan van Eyck Academie with Forum on Quaero: a public think tank on the politics of the search engine
2008: Vienna, Deep Search conference
2009: Amsterdam, Society of the Query conference
2010: Book by Peter Olsthoorn, 'De Macht van Google', which provides a comprehensive overview of the successes and failures of Google.
The European Continental approach is represented by Matteo Pasquinelli ((Google's PageRank: Diagram of the Cognitive Capitalism and Rentier of the Common Intellect)), Dmytri Kleiner with his Telecommunist Manifesto, and Yann Moulier Boutang, who looks at Google as a form of cognitive capitalism. He analyzes how our queries are contributions, a kind of activity he likens to human pollination. Through these small activities contributed by billions of users we come to understand why Google is so big right now. On the other hand there are resentment theories like "Google is Evil" — how can we counteract that concern? Can we come up with an amoral perspective that goes beyond good and evil?
Siva Vaidhyanathan, The Googlization of the Global Street
Googlization is the term Siva uses to describe "the process of being processed, rendered and represented by Google." It is what we experience through the lens of Google, with information as the prime example: it focuses our attention through a single set of sources that filters all others — but what are its biases? It is the Googlization of us, our identities, our desires. "Don't be Evil" is Google's famous informal motto, and its mission "to organize the world's information and make it universally accessible and useful" is an audacious idea — the original sin of hubris. The mission statement is as old as the company (12 years). It is a totalizing statement: should it have warned us from the beginning? It is ubiquitous, yet we have not thought about it enough.
How does Google complicate our idea of privacy and our efforts to manage our reputations? Through trial and error on the way to adulthood we learn that there are things we should not share with our parents or with other people. Privacy is used in many contexts, and Google is involved in three ways:
- Record of our desires (also referred to by John Battelle as the Database of Intentions)
- Google makes available ‘obscure’ information
- It actively captures images (of us going through our daily lives, e.g. Google Street View)
The power of the default
Google is obsessed with giving us choices — not simple choices but complicated ones. The trade-off is giving up your privacy for functionality, but you are never given a contract for this transaction. Privacy is not a substance: it cannot be counted or traded, because it means different things in different contexts. Google lives on the power of the default: all defaults retrieve the maximum amount of information, to Google's advantage. In the previous sentence "Google" may also be replaced by "Facebook", which has the same reputation. How can we counteract this? It requires a set of steps:
- You have to know
- You have to care that the deal is not fairly negotiated
- You have to explore (e.g. how-to videos about Google's settings)
- You have to act
In their book Nudge (2008), Richard Thaler and Cass Sunstein describe how design matters and how it is aligned with a choice architecture. They introduce libertarian paternalism and strive "to design policies that maintain or increase freedom of choice." (( Richard H. Thaler and Cass R. Sunstein, Nudge: Improving Decisions About Health, Wealth, and Happiness, 1st edn (Yale University Press, 2008). )) (p. 5) This is a meaningful freedom with real control over the issues of one's life. But only the elite and the proficient get to opt out of the choice architecture set by Google: a vulgar libertarianism in which self-help is available in the settings. It is not only elitist but also anti-social (you help yourself in the settings, but not others).
Privacy interfaces & The Cryptopticon
Siva distinguishes different types of privacy interfaces. We are now all agents of surveillance, since we all carry video cameras. Google and Facebook scramble the different contexts described above; our new information systems — Google Buzz being the example — inflated and ignored these contexts. On top of that, there is a difference between the types of content indexed by Google:
- Content on third party service websites
- User-generated content platforms from Google (Blogger, Orkut, etc.)
- Actively capturing content (Google Books).
These three require three levels of responsibility and regulation according to Siva.
Google seems to thrive on the economics of attention where:
- Controversy is good
- Users scouring for troubling photos saves Google staff the labour time of doing the same (in my first-year lectures on crowdsourcing I use the example of the GeenStijl community looking for "interesting" Google Street View pictures when the service was first introduced in the Netherlands)
- There is “User-generated editing”
- Any time more people are using Google, it’s better for Google
- Defaults are set for maximum exposure.
Opting out of being indexed may be physical, as in the case of people in Germany putting signs in their windows that said "Don't include me in Google Street View." Siva refers to Google's default-setting policy as protocol imperialism: Google wants us to become accustomed to its default settings.
In Bentham's Panopticon the instrument of surveillance was visible: you could see the tower and its mirrors. If you don't see it, it can't do its job. Now the instrument of surveillance is hidden, and we are not allowed to understand the instruments of surveillance at work. The government wants you to slip up. Facebook and Amazon want you to express your niche preferences so they can track your idiosyncrasies. This opposite of the panopticon — the cryptopticon — wants you to express deviations.
During the Q&A it is mentioned that in the case of Google Street View the cars are visible, and as such the instrument of surveillance is not hidden. Siva answers that the cryptoptic element in this case is that Google was also retrieving MAC addresses, while claiming "they didn't know."
Rob Gonggrijp: How can the deal with Google ever be collective because of the different types of privacy (contexts)?
Siva Vaidhyanathan: A collective baseline with exceptions.
Colleague Michael Stevenson ends the evening with the question of whether it is enough to think about Google in terms of its effects. To what extent have we been Googlized already?