Fiduciary Data Banking: A Market-Based Alternative to Surveillance Capitalism

Part I: Describing the Problem

Seth Goldstein
7 min read · Apr 2, 2019

Privacy in the Age of Social Media

Facebook prompts its 2 billion users every day with a simple question: “What’s on your mind?” This gentle nudge is the beginning of a rich funnel of engagement cues that remove the friction of privacy concerns and help drive the company’s $70 billion in annual advertising revenue. Similarly, the flashing cursor inside Google’s search box conditions billions of users to share their most intimate intentions, generating more than $100 billion a year in sales.

The immediate benefits of convenience often distract users from longer-term privacy interests, and so Internet companies keep making it easier to engage with their algorithms: from Facebook Connect and Google OAuth to the emergence of intelligent assistants like Amazon’s Alexa and Apple’s Siri.

The slow and steady sublimation of privacy to convenience numbs us to its loss. Each data exposure is another cut in a death of a thousand cuts, and the constant stream of privacy breaches makes them seem inevitable. Facing Congress in February 2019, the CEO of Equifax refused to disclose his own date of birth and Social Security number, even though his company had leaked exactly that information for over 140 million Americans. Marriott lost plain-text identity and payment credentials for millions of guests. Outrage is blunted with each news cycle. Our increasing desensitization to the loss of privacy creates profound and pervasive emotional consequences.

In his 1967 book Privacy and Freedom, Alan Westin describes a particular anxiety that results from the loss of these privacy rights:

Knowledge or fear that one is under systematic observation in public places destroys the sense of relaxation and freedom that men seek in open spaces and public arenas.

Almost fifty years later, in their paper Taking Trust Seriously in Privacy Law, Neil Richards and Woodrow Hartzog describe a mood that has evolved from anxiety to depression:

People feel confused and disempowered when it comes to their data. Instead of feeling confident that we will be protected when we share information with others, we increasingly feel helpless and resigned to our fate.

In his 2002 paper Conceptualizing Privacy, Daniel Solove suggests that privacy comprises the following rights:

  1. right to be let alone
  2. right to limited access to the self
  3. right to secrecy
  4. right to control of personal information
  5. right to personhood
  6. right to intimacy

The growth of smartphones, social apps, and AI puts individuals at a distinct disadvantage in protecting their privacy. The lure of addictive “free” products such as the Facebook News Feed, Instagram Stories, or Google Search/Maps/Gmail blinds users to their hidden costs: loss of privacy, loss of agency, and loss of control over the means of personal data production.

Shoshana Zuboff describes these costs further in her 2015 essay Big Other: Surveillance Capitalism and the Prospects of an Information Civilization:

Although some of these data are applied to product or service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as machine intelligence, and fabricated into prediction products that anticipate what you will do now, soon, and later. Although the saying tells us “If it’s free, then you are the product,” that is also incorrect. We are the sources of surveillance capitalism’s crucial surplus. Surveillance capitalism’s actual customers are the enterprises that trade in its markets for future behavior.

Internet product managers and engineers develop addictive feedback loops through notifications and other dopamine-driving rewards. The more engaging these apps are to users, the more personal data they are able to harvest. Companies have defended their practice of user data extraction by claiming that they provide adequate notice and consent, and that their users explicitly opt in to sharing their personal data.

In reality, users scroll quickly through privacy policies and click “I Understand.” As Alexis Madrigal explains in his article in The Atlantic:

So, each and every Internet user, were they to read every privacy policy on every website they visit, would spend 25 days out of the year just reading privacy policies! If it was your job to read privacy policies for 8 hours per day, it would take you 76 work days to complete the task.
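
The two figures are consistent with each other, which is easy to miss on first read. A quick sanity check, as a minimal Python sketch assuming Madrigal’s 8-hour work day and 76-work-day estimates:

```python
# Sanity check of Madrigal's figures (assuming his estimates of an
# 8-hour work day and 76 work days of privacy-policy reading per year).
HOURS_PER_WORK_DAY = 8
work_days = 76

total_hours = work_days * HOURS_PER_WORK_DAY  # 608 hours of reading
calendar_days = total_hours / 24              # ~25.3 round-the-clock days

print(f"{total_hours} hours is about {calendar_days:.1f} full days per year")
```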

These asymmetric contracts are non-negotiable and reduce agency. Intermediaries like Facebook Connect and Google OAuth reinforce this inequality by functioning like a “fund of funds” in the allocation of consumer permissions to third-party apps. Meanwhile, companies build rich profiles of their users without the users knowing what is known about them (for example, 74 percent of Facebook users don’t realize the site collects their interests to target ads). This dynamic represents the growing challenge of inverse privacy, as Yuri Gurevich, Efim Hudis, and Jeannette Wing describe:

Due to progress in technology, institutions have become much better than you in recording data. As a result, shared data decays into inversely private. More inversely private information is produced when institutions analyze your private data.

The challenge, therefore, is not only that companies have access to private user data, but that they are far better at understanding this information than its “owners” are themselves.
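
The “fund of funds” role of connect-style intermediaries is visible in the shape of an OAuth 2.0 authorization request. The sketch below is illustrative (the endpoint, client ID, redirect URI, and scope names are all hypothetical), but the pattern of bundling broad permissions behind a single consent click is standard OAuth 2.0:

```python
from urllib.parse import urlencode

# Hypothetical third-party app requesting delegated access via OAuth 2.0.
# Endpoint, client_id, redirect_uri and scopes are all placeholders.
AUTH_ENDPOINT = "https://accounts.example.com/o/oauth2/auth"

params = {
    "client_id": "third-party-app-123",
    "redirect_uri": "https://thirdparty.example/callback",
    "response_type": "code",
    # Broad scopes bundled into a single take-it-or-leave-it request:
    "scope": "profile email contacts.read activity.read",
}

# The URL the user lands on after clicking "Connect": one dialog, one
# click, and every scope above is granted at once.
consent_url = f"{AUTH_ENDPOINT}?{urlencode(params)}"
print(consent_url)
```

The user sees one dialog and clicks once; the intermediary decides how those permissions flow downstream.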

The Slippery Ice of Surveillance Capitalism

Shoshana Zuboff’s seminal new work, The Age of Surveillance Capitalism, describes in detail the economic consequences of inverse privacy. The difference between what users know about themselves and what others know about them becomes a vital source of surplus value. She defines surveillance capitalism as “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction and sales.”

Given the technology infrastructure and cash reserves of the Internet giants, it is hard to imagine surveillance capitalism slowing anytime soon. How might the surplus value currently held by companies such as Facebook and Google be redistributed to users?

Surveillance capitalists sell probabilities of what users will do next: they harvest historical data in order to train AI systems to better predict future behavior. Their factories produce “free” social media, search, email, and other services that attract user attention. Their business model starts with enabling a user to do something easy (“what’s on your mind?”), anticipating what they might do next (search auto-complete), and then establishing predictive models for sale to marketers, which lead to more active behavioral modification: nudging, influencing, and eventually actuating a commercial transaction.
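
That pipeline can be caricatured in a few lines of Python. This is a toy sketch, not anyone’s production system: the behavioral features and data are invented, but the harvest-train-predict loop is the business model in miniature (here using scikit-learn’s LogisticRegression):

```python
# Toy sketch of the pipeline: harvest behavioral history, train a model,
# sell predictions of what a user will do next. Features and data are
# invented for illustration.
from sklearn.linear_model import LogisticRegression

# Harvested "behavioral surplus": [posts_today, minutes_on_site, ads_seen]
history = [
    [3, 45, 12],
    [0, 5, 2],
    [5, 90, 30],
    [1, 10, 4],
]
clicked_ad = [1, 0, 1, 0]  # the future behavior the model learns to predict

model = LogisticRegression().fit(history, clicked_ad)

# The "prediction product": probability this user clicks an ad next session.
new_user = [[4, 60, 20]]
print(model.predict_proba(new_user)[0][1])
```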

User experience engineers and growth hackers optimize onboarding flows and notification schemes to trigger bursts of dopamine that ease users down conversion funnels. As early Facebook employee Jeff Hammerbacher said memorably in 2011, “The best minds of my generation are thinking about how to make people click ads.”

As Hartzog testified at Congress’s March 2019 hearing on “Policy Principles for a Federal Data Privacy Framework in the United States”:

Companies create manipulative interfaces that exploit our built-in tendencies to prefer shiny, colorful buttons and ignore dull, grey ones. They may also shame us into feeling bad about withholding data or declining options. Many times, companies make the ability to exercise control possible but costly through forced work, subtle misdirection, and incentive tethering. Sometimes platforms design online services to wheedle people into oversharing, such as keeping a streak going or nudging people to share old posts or congratulate others on Facebook. Companies know how impulsive sharing can be, and an entire system is set up to make it so easy.

Privacy, however, is friction in this model: a monkey wrench wedged into the conversion funnel, one that users can hold onto so as not to fall into a pool of their own behavioral surplus. In his posthumously published 1953 classic Philosophical Investigations, Wittgenstein defends the value of cognitive friction as essential to clear thinking:

We have got on to slippery ice where there is no friction and so in a certain sense the conditions are ideal, but also, just because of that, we are unable to walk. We want to walk: so we need friction. Back to the rough ground!

The concept of rough ground is important. For twenty years, since the inception of the Web as a commercial medium in the late nineties, the overriding business logic of the Internet could be summarized in two words: increase conversion. After years of promoting the values of personalization and convenience, there is a growing sense that these values have had unintended long-term consequences: loss of privacy, agency, and control.

There are a number of potential solutions that stakeholders in this ecosystem might adopt to reverse some of these dangerous trends. For example:

  • Regulators could mandate privacy by design.
  • Advertisers and investors could stop purchasing dirty, conflicted data.
  • Internet platforms might self-regulate and create speed bumps for their most addicted users.
  • Users could adopt strategies to shield themselves from surveillance.

In her recent paper The Antitrust Case Against Facebook, Dina Srinivasan calls for empowering consumers to say no to surveillance in the form of a Do Not Track switch:

The fact that this century’s new communications utility is free but necessitates widespread surveillance of consumers is a paradox in a democracy. Facebook watches, monitors, and remembers what over 2 billion people do and say online. Contrary to what those in the advertising industry would have regulators think, American consumers value a state of no surveillance and have attempted to protect this aspect of their privacy since the beginning. The fact that the free market today offers no real alternative to this exchange is a reflection only of the failure of competition… For this, we need to empower consumers with a singular Do Not Track switch that can counter the collusion in the horizontal market. Consumers must be able to just say no to commercial surveillance.

Whether it is a pervasive NO button that can be invoked across the Internet to stop sharing data with selected entities (the anti-like?), or related tools to disguise future intentions, a new era of consumer data activism is poised to help people become less predictable to the tools and technologies they use — and are used by.
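
A Do Not Track switch even has a technical precedent: the DNT HTTP header, which browsers send as DNT: 1 when the setting is enabled. Below is a minimal sketch of a server that honors it. The handler logic is hypothetical, and in practice the header has been widely ignored, which is exactly why Srinivasan argues for giving the “no” legal force:

```python
# Minimal sketch of a server that honors the DNT request header. The
# handler logic is illustrative; the "DNT: 1" header itself is real and
# is sent by browsers when the Do Not Track setting is enabled.
from http.server import BaseHTTPRequestHandler, HTTPServer

class DNTAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The user said no: skip profiling, ad targeting, data resale.
        tracking_enabled = self.headers.get("DNT") != "1"

        self.send_response(200)
        self.end_headers()
        self.wfile.write(f"tracking_enabled={tracking_enabled}\n".encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), DNTAwareHandler).serve_forever()
```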

I wrote this with Pavel Machalek, my technical co-founder at Fiducia. In Part II, we propose a framework for fiduciary data banking that we believe addresses this problem with a market-based solution.
