Does Big Tech Mark the End of Democracy?

Devin Benson
Sep 8, 2020

To answer this question, I'll start by defining "Big Tech" as the collection of companies and industries that profit from and promote Surveillance Capitalism, a term coined by Shoshana Zuboff in her book "The Age of Surveillance Capitalism." Surveillance Capitalism is the idea that our human experiences can be treated as raw material by tech companies. These companies take our experiences (as recorded through our limitless metadata), run them through machine intelligence, and generate what Zuboff calls behavioral futures markets, where predictions about our future behavior are studied, bought, and sold to extract profit from us.

What does this look like?

  1. Human experiences: Your location is a great example of this. You use Google Maps for the convenience of getting to new places easily, while in the background these apps, services, products, and devices record metadata on every interaction you have. The companies collecting that metadata need only your starting point and destination for the map to work; everything else they constantly record beyond that is what Zuboff calls behavioral surplus, and they own it.
  2. Behavioral Surplus is analyzed by machine intelligence: The metadata is stored, analyzed, and used to 'improve your experience' by recommending the Thai place closest to your commute home when you open Seamless on the train. The industry of Surveillance Capitalism would prefer that you stop digging here and believe the constant monitoring is only a means to deliver you better service. In reality, the findings drawn from this data are where big tech makes its money. No person could draw conclusions from this amount of data as quickly as a computer can, so the monopoly on information that big tech enjoys rests on its monopoly on the machine intelligence that analyzes your data.
  3. Behavioral Futures Markets: Big tech makes its profits when your future behavior can be bought and sold by the investor with the most money. While you only used Google Maps to get home after a long day at the office, the framework behind the technology knows that you left the office three hours later than usual and didn't go to the gym like you usually do on Tuesdays. Computed at scale, this surveillance could predict that you are stressed about work and more likely to make an impulse buy to make yourself feel better. The industry can then sell your attention (while you're at your most vulnerable) to whoever will pay for it, and the price of your attention rises with the accuracy of the prediction.
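The three steps above can be sketched as a toy pipeline. Everything here is invented for illustration (the event names, the "stress score," the pricing rule); it describes no real company's system, only the shape of the argument: surplus data in, prediction out, attention priced by predictability.

```python
# Hypothetical sketch of the three-step pipeline: experiences -> surplus
# -> machine analysis -> priced attention. All names and numbers are
# invented for illustration.

from dataclasses import dataclass


@dataclass
class Event:
    """One recorded human experience (step 1: raw material)."""
    kind: str         # e.g. "left_office_late", "skipped_gym"
    deviation: float  # how far this departs from the user's routine, 0..1


def behavioral_surplus(events):
    """Step 2 input: everything recorded beyond what the service needs.
    Only the route request is needed for a map to work; the rest is surplus."""
    return [e for e in events if e.kind != "route_request"]


def stress_score(surplus):
    """Toy 'machine intelligence': average deviation from routine."""
    if not surplus:
        return 0.0
    return sum(e.deviation for e in surplus) / len(surplus)


def attention_price(score, base_price=1.0):
    """Step 3: the more confidently a vulnerable moment can be predicted,
    the more that user's attention sells for."""
    return base_price * (1 + score)


events = [
    Event("route_request", 0.0),
    Event("left_office_late", 0.8),
    Event("skipped_gym", 0.6),
]
surplus = behavioral_surplus(events)
score = stress_score(surplus)
print(f"stress score: {score:.2f}, attention price: {attention_price(score):.2f}")
```

The point of the sketch is the asymmetry: the user asked for one route, but the system keeps and monetizes everything else it observed along the way.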

At a glance this is just marketing, but when you consider how our access to information (like which coat to buy online) is both filtered through and manipulated by big tech, we start to see that our future behavior is for sale to the highest bidder.

A real example of this is Facebook showing female users more depressing items in their feeds on Monday mornings. Moods can be inferred from the emojis users send, and lowered moods make users more likely to buy clothes and makeup. Facebook can then sell ads for clothing and makeup at higher prices on Mondays.

Another example is a marketing firm purchasing credit and bank data from data brokers, then using it to target low-income communities with predatory payday loans in the days before bills are due. In a country where socio-economic lines are often drawn along racial lines, deciding which financial products are marketed to which groups deepens economic inequality between racial demographics.

The scariest examples revolve around our democratic process. What if Facebook knew that adding an "I voted" badge to your profile made the friends who saw it 4% more likely to vote themselves? What if that information were for sale, and someone with the right algorithm and understanding of electoral politics could purchase "I voted" buttons in the right swing counties, in the right swing states, for the right party's voters, where a 4% increase in turnout could decide a Presidential election? Even scarier, what if this information wasn't for sale, but was still usable by Facebook as a corporate entity?
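The thought experiment above is just arithmetic, and a few invented numbers show how little it takes. The county sizes and margins below are entirely hypothetical; the only real input is the 4% figure from the scenario:

```python
# Back-of-the-envelope arithmetic for the turnout thought experiment.
# Every county population and margin here is invented for illustration.

swing_counties = [
    # (targeted eligible voters, current vote margin against their party)
    (120_000, 3_000),
    (80_000, 2_500),
]

BOOST = 0.04  # the hypothetical 4% turnout lift from seeing "I voted" badges

for voters, margin in swing_counties:
    extra_votes = voters * BOOST       # new votes the badge campaign adds
    flipped = extra_votes > margin     # enough to overcome the deficit?
    print(f"{voters:>7} targeted voters, margin {margin}: "
          f"+{extra_votes:.0f} votes -> {'flips' if flipped else 'holds'}")
```

A 4% lift among 120,000 targeted voters is 4,800 votes, which swamps a 3,000-vote margin. The mechanism doesn't need to persuade anyone; it only needs to choose who sees the badge.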

This leads me to think that “tech” as a collection of tools and discoveries isn’t leading to the end of American Democracy, but I am scared of what these tools can do in the hands of Surveillance Capitalists.

Possible solutions?

You as a user own your “behavioral surplus.” It can’t be studied, analyzed, or sold without your consent.

Products from these tech firms (like marketing based on moods that they affect) have to go through the same types of ethical panels used in sociology or medical studies.

Related Resources:

"The Age of Surveillance Capitalism" by Shoshana Zuboff

“Data and Goliath” by Bruce Schneier

“The Great Hack” directed by Karim Amer and Jehane Noujaim
