Beyond Privacy: Siva Vaidhyanathan on the Hubris of Google


Introduction by Geert Lovink

Google Research

What is the current state of Google research? The first phase of Google research mainly dealt with the mystery of the algorithm and with Google’s founders Larry Page and Sergey Brin. Google has since grown into a multi-faceted corporation, which requires a move beyond this first phase. The second phase could be considered a European critical perspective:
2007: Maastricht, Jan van Eyck Academie, Forum on Quaero: a public think tank on the politics of the search engine
2008: Vienna, Deep Search conference
2009: Amsterdam, Society of the Query conference
2010: Peter Olsthoorn’s book ‘De Macht van Google’, which provides a comprehensive overview of Google’s successes and failures.

The European Continental approach is represented by Matteo Pasquinelli ((Matteo Pasquinelli, ‘Google’s PageRank Algorithm: A Diagram of Cognitive Capitalism and the Rentier of the Common Intellect’)), Dmytri Kleiner with his Telekommunist Manifesto, and Yann Moulier Boutang, who looks at Google as a form of cognitive capitalism. He analyzes how our queries and activities are contributions, a form of human pollination: through the small activities contributed by billions of users we come to understand why Google is so big right now. On the other hand there are resentment theories like “Google is Evil.” How can we counteract that concern? Can we come up with an amoral perspective that goes beyond good and evil?


Siva Vaidhyanathan, The Googlization of the Global Street

Google’s Hubris

Googlization is the term used by Siva to describe “the process of being processed, rendered and represented by Google.” It is what we experience through the lens of Google, with information as the prime example: it focuses attention on a set of sources that filter all others, but what are the biases? It is the Googlization of us, our identities, our desires. “Don’t be Evil” is Google’s famous informal motto, and its mission “to organize the world’s information and make it universally accessible and useful” is an audacious idea; it is the original sin of hubris. The mission statement is as old as the company itself (12 years). It is a totalizing statement: should it have warned us from the beginning? It is ubiquitous, but we have not thought about it enough.

How does Google complicate our idea of privacy and our efforts to manage our reputations? Through the trial and error of growing up into adolescence we learn that there are things we should not share with our parents or with other people. Privacy is used in many contexts, and Google is involved in three ways:

  1. Record of our desires (also referred to by John Battelle as the Database of Intentions)
  2. Google makes available ‘obscure’ information
  3. It actively captures images (of us going through our daily lives, e.g. Google Street View)

The power of the default

Google is obsessed with giving us choices, not simple choices but complicated choices. The trade-off is giving up your privacy for functionality, but you are never given a contract for this transaction. Privacy is not a substance: it cannot be counted or traded, because it has different contexts. Google lives on the power of the default: all defaults are set to retrieve the maximum amount of information, to Google’s advantage. In the previous sentence ‘Google’ may just as well be replaced by ‘Facebook’, which has the same reputation. How can we counteract this? It requires a set of steps:

  1. You have to know
  2. You have to care that the deal is not fairly negotiated
  3. You have to explore (e.g. Google’s how-to videos)
  4. You have to act

In Richard Thaler and Cass Sunstein’s book Nudge (2008) they describe how design matters and how it is aligned with a choice architecture. They introduce the concept of libertarian paternalism and strive “to design policies that maintain or increase freedom of choice.” (( Richard H. Thaler and Cass R. Sunstein, Nudge: Improving Decisions About Health, Wealth, and Happiness, 1st edn (Yale University Press, 2008), p. 5. )) It is a meaningful freedom, with real control over the issues of one’s life. Only the elite and the proficient get to opt out of the choice architecture set by Google: a vulgar libertarianism where self-help is available in the settings. It is not only elitist but also anti-social (you help yourself in the settings, but not others).

Privacy interfaces & The Cryptopticon

There are different types of privacy interfaces:

  • Person-to-peer
  • Person-to-power
  • Person-to-firm
  • Person-to-state
  • Person-to-public

We are now all agents of surveillance, as we all carry video cameras. Google and Facebook scramble the different contexts described above. Our new information system, with Google Buzz as the example, conflated and ignored these contexts. On top of that, there is a difference between the types of content indexed by Google:

  1. Content hosted on third-party websites and services
  2. Content on Google’s own user-generated content platforms (Blogger, Orkut, etc.)
  3. Content that Google actively captures itself (Google Books).

These three require three levels of responsibility and regulation according to Siva.


Google seems to thrive on the economics of attention where:

  • Controversy is good
  • Users scouring for troubling photos saves Google staff the labour of doing the same (in my first-year lectures on crowdsourcing I use the example of the GeenStijl community looking for “interesting” Google Street View pictures when the service was first introduced in the Netherlands)
  • There is “User-generated editing”
  • Any time more people are using Google, it’s better for Google
  • Defaults are set for maximum exposure.

Opting out of being indexed may be physical, as in the case of people in Germany putting signs in their windows that said “Don’t include me in Google Street View.” Siva refers to Google’s default-setting policy as protocol imperialism: Google wants us to become accustomed to its default settings.

The Cryptopticon

In Bentham’s Panopticon the instrument of surveillance was clear: you could see the camera/mirror, and if you don’t see it, it can’t do its job. Now the instrument of surveillance is hidden, and we are not allowed to understand the instruments of surveillance at work. The government wants you to slip up. Facebook and Amazon want you to express your niche preferences so that they can track your idiosyncrasies. The cryptopticon, the opposite of the Panopticon, wants you to express your deviations.

Q&A

During the Q&A it is mentioned that in the case of Google Street View the cars are visible, and as such the instrument of surveillance is not hidden. Siva answers that the cryptoptic element in this case is that Google was also retrieving MAC addresses, while claiming “they didn’t know.”

Rop Gonggrijp: How can the deal with Google ever be collective, given the different types of privacy (contexts)?
Siva Vaidhyanathan: A collective baseline with exceptions.

Colleague Michael Stevenson ends the evening with the question whether it is enough to think about Google in terms of Google effects. To what extent have we been Googlized already?
