by Andrew Hua
A scant line of police in front of the highest legislature in the country. Countless men, a sea, forming in front of them. A moment, then an eruption of chaos. A breach, the abyss opening before our eyes. These are the telltale signs of a coup in progress. According to Shoshana Zuboff, author of The New York Times article “The Coup We Are Not Talking About,” the attack on the US Capitol on January 6th, 2021, masks the alarming progression of a coup more pernicious and harder to see: the capture, monopolization, and manipulation of personal and societal information by corporations, a seizure that threatens to override democratic institutions.
Of course, no coup occurs in a vacuum. Much like political coups, this new type of coup requires a foundation, support, and execution. First, Zuboff explains, privacy laws were eroded amid the fallout of the 9/11 attacks. At the same time, social media companies grew in size and prevalence. Indeed, during the early twenty-first century, digital platforms such as Facebook, Google, Twitter, and many others ballooned until they became a ubiquitous part of life. According to Statista, over 311 million Americans used the internet in 2022, with 41% agreeing in a 2023 poll that “[they] could no longer imagine [their] everyday life without the internet” (Petrosyan; Bashir). Together, these two factors allowed social media companies to amass ever more personal information from online interactions, and government agencies came to rely on this data collection to extend their surveillance reach. As Zuboff writes, “[the] revolutionary roots of surveillance capitalism are planted in this unwritten political doctrine of surveillance exceptionalism . . . granting the new internet companies a license to steal human experience and render it as proprietary data.”
With the government giving tacit approval to large-scale surveillance, there followed a “sharp rise in epistemic inequality,” which Zuboff explains as “the difference between what I can know and what can be known about me.” This is the coup’s second stage. Since any user engagement is profitable for companies through advertising, increasing engagement becomes the end that justifies any and all means. As Andrew “Boz” Bosworth, a Facebook executive, wrote in a leaked 2016 memo, “We connect people . . . . The ugly truth is . . . anything that allows us to connect more people more often is *de facto* good” (Mac et al.). As social media companies have accumulated and aggregated consumers’ information for profit, their ability to manipulate individuals’ behaviors and preferences has steadily grown, driving user engagement toward a breaking point. At this third stage, which Zuboff claims is happening now, companies’ optimization for profit comes into conflict with reality itself. As Zuboff notes, “surveillance capitalism’s operations have no formal interest in facts. All data is welcomed as equivalent.” Companies feed users increasingly personalized streams that are “radically indifferent to meaning, facts and truth” (Zuboff). The capacity “to microtarget users [and] manipulate them” results in epistemic chaos, a worsening state of societal fragmentation (Zuboff). Ultimately, democratic institutions will no longer bear the weight of such a collapse, and the private institutions that control information will sweep in, supplanting democracy and completing the coup. This is the fourth and final stage of the epistemic coup according to Zuboff, one based on information instead of force.
What Zuboff neglects to fully explain, however, is how we have fallen prey to infiltration by social media. How have people come to trust algorithms and social media while growing increasingly distrustful of governments, corporations, and, eventually, each other?
Arthur C. Clarke, the famous science fiction writer, claims that “any sufficiently advanced technology is indistinguishable from magic” (Clarke 21). Similarly, academic and video game designer Ian Bogost argues that as technology accelerates, its ever-increasing complexity provides the means for obfuscation and enchantment. In an essay for The Atlantic, Bogost writes that public ignorance about the true workings of algorithms and computer systems fosters the idea that an algorithm is “concise and efficient,” a “flawless little trifle of lithe computer code, processing data into tapestry like a robotic silkworm.” In reality, he points out, behind the endless stream of algorithmic pronouncements in the public sphere sits a complex system of infrastructure, labor, and assets, all of which are neatly ignored by users who instead believe it is all done by “an algorithm.” For example, when Netflix hosted a competition in search of an ultimate algorithm for recommending films, the company discovered that no such golden algorithm existed. Instead, it ended up relying on countless hours spent manually classifying films for compilation and recommendation (Bogost). This simplistic vision that erases human labor produces the fiction of an ultimate truth machine: all-knowing, always correct, and capable of anything. Bogost argues that our trust in algorithms can approach “theology”: our “supplication made to . . . computers” addresses the authority of a higher power while ignoring the complex computational systems whose projections of truth we take on “faith.”
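Bogost’s point is easier to see with a toy example. The following sketch is purely hypothetical: every title, tag, and function name is invented for illustration and has nothing to do with Netflix’s real system. What a user would experience as “the algorithm” is a few trivial lines of code; all of the apparent intelligence lives in tags that human workers assigned by hand.

```python
# A hypothetical sketch in the spirit of Bogost's Netflix example.
# The "algorithm" below is trivial; the real work hides in the
# hand-curated tags, i.e., in uncredited human labor.

# Tags produced by hours of manual classification, not computation.
HAND_LABELED_TAGS = {
    "Film A": {"thriller", "slow-burn"},
    "Film B": {"thriller", "heist"},
    "Film C": {"romance", "period-piece"},
}

def recommend(watched: str, catalog: dict) -> list:
    """Rank other titles by how many hand-assigned tags they share."""
    target = catalog[watched]
    others = (title for title in catalog if title != watched)
    return sorted(others, key=lambda t: len(catalog[t] & target), reverse=True)

print(recommend("Film A", HAND_LABELED_TAGS))  # ['Film B', 'Film C']
```

To the user, the result appears instantly and seems like magic; the robotic silkworm, in Bogost’s image, is simply human labor baked into the data and then forgotten.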
Consequently, people are often oblivious to the real gaps and shortfalls of algorithms. Since no one can completely grasp the effort, scale, and technology invested in producing these algorithms, people are left with a raw belief in the trustworthiness and validity of digital products. As a benign example of this adherence, consider the countless cases of drivers following their GPS instructions to absurd ends. In Belgium, Sabine Moreau set out to drive to Brussels, ninety miles away. Following her GPS, she was guided across Central Europe until she arrived in Zagreb, Croatia, about 810 miles in the wrong direction (Matyszczyk). In Australia, a trio of Japanese tourists followed GPS instructions toward an island separated from the mainland by nine miles of water; they made it fifty yards into the water and mud before realizing they were stranded (Fujita). Although cases like these mostly amount to lost vehicles and humorous anecdotes, they reveal an inherent trust in computer systems and algorithms. And when thousands of people carry the same false beliefs propagated by social media, which also serves as a platform to organize, disasters like January 6th follow. Worryingly, that case is not unique. A similar attack took place in Brazil on January 8th, 2023, complete with calls to “Stop the Steal” and the mass propagation of electoral fraud allegations via social media, culminating in riots that stormed key government buildings in the capital, Brasília (Dwoskin).
This is epistemic chaos, and Bogost’s idea of algorithmic faith gives us a deeper understanding of how we enabled it to emerge. It is this faith, built on our ignorance of the true inner workings of the systems that drive algorithmic recommendations, that reinforces epistemic inequality and strengthens the epistemic coup.
So, what comes next for societies like ours? The coup is clearly progressing through its third stage, which means it is rapidly approaching the fourth. Zuboff rather tersely explains that the fourth stage features the institutionalization of “epistemic dominance” and the “overriding [of] democratic governance with computational governance by private surveillance capital.” But what exactly does that dominance entail? How does an ostensibly democratic society with free institutions regress to outright plutocracy?
Like the epistemic coup itself, this process is insidious and gradual, advancing from covert to overt. In an investigation of how data is used and misused in a digital society, John Cheney-Lippold cites a predictive policing project conducted by the Chicago Police Department (CPD). He likens the Chicago project to the film Minority Report, in which clairvoyants (“precogs”) expose future criminals to preemptive detainment. The CPD, for its part, used algorithms that process crime and social statistics to identify so-called “at-risk” individuals. Acting upon these predictions, police responded proactively to those deemed “potential” victims or perpetrators (Cheney-Lippold 22). Though such measures are touted as benefiting both the public and police efficiency, they show how human capacity and responsibility for decision-making can be gradually supplanted by algorithmic sources. Cross this scenario with the speculative nightmare of Robocop, in which the Detroit Police Department is taken over by a private corporation, and Zuboff’s fourth stage begins to look genuinely threatening.
Today, surveillance is born of corporate interests, producing surveillance capitalism. But when the government also participates in surveillance, it unleashes the terrifying merger of state and information. The 2018 Cambridge Analytica scandal, in which Facebook user data was mined for targeted political advertising, has already shown how the misuse of a tech company’s “suite of capabilities” can “pivot[] the whole machine . . . from commercial to political interests” (Zuboff). Algorithmic models like the CPD’s assign predictive labels to individuals that may bear no relation to the reality they attempt to describe. As Cheney-Lippold notes, “Measurable types of identity like Google’s ‘gender’ or the CPD’s ‘at risk’ are . . . ever changing, dependent on flows of available data” (31). So, while the labels attached to individuals’ data sets have no stable meaning, their assignment has real consequences for the people deemed victims or perpetrators. As Cheney-Lippold puts it, “We can think of a measurable type like ‘at risk’ as a hieroglyph, not a truth of identity but a priestly interpretation” (24).
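A short sketch makes the instability of a measurable type concrete. Everything below is invented for illustration; the CPD’s actual features, weights, and cutoffs are not public. The point is only structural: the label is a threshold over whatever data happens to flow in, so a new trickle of data can silently flip a person’s algorithmic “identity.”

```python
# A hypothetical "measurable type" in Cheney-Lippold's sense: the label
# is nothing but a weighted threshold over available data. All features,
# weights, and the cutoff are invented for illustration.

WEIGHTS = {"acquaintance_arrests": 0.6, "prior_stops": 0.3, "age_factor": 0.1}
CUTOFF = 0.5

def at_risk(person: dict) -> bool:
    """Assign the 'at risk' label from whatever data is available."""
    score = sum(WEIGHTS[k] * person.get(k, 0.0) for k in WEIGHTS)
    return score >= CUTOFF

person = {"acquaintance_arrests": 0.4, "prior_stops": 0.5, "age_factor": 0.8}
print(at_risk(person))  # False: score = 0.47, just under the cutoff

# A new flow of data arrives -- an acquaintance is arrested -- and the
# same person's label silently flips, with real-world consequences.
person["acquaintance_arrests"] = 0.6
print(at_risk(person))  # True: score = 0.59
```

Nothing about the person changed; only the data flow did. The hieroglyph was reinterpreted, and the “priestly interpretation” now reads differently.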
Bogost’s idea of algorithmic faith raises this danger one step higher, because algorithms’ imperfections grant plausible deniability to the very institutions that rely on their perceived infallibility. Cheney-Lippold illustrates this dynamic with a case in which the facial recognition technology in an HP computer failed to recognize an African American face. The company’s response? “We are working with our partners to learn more. The technology we use is built on standard algorithms . . . ” (16). While this case was mostly benign and, one hopes, not the intention of the technology’s designers, HP’s response shows how algorithms can be used to obscure real biases and real oppression. Would the average person be able to tell the difference between accidental bias and programmed bias? Would the average person recognize if and when such bias bolsters government power through suppression?
The amorphous results and inscrutable workings of predictive models mean that neither the governed nor the governing can truly comprehend the full system in place. More sinister still, the plausible deniability granted by the use of algorithms allows authorities to make decisions that actively harm or oppress their citizenry while hiding behind the trust given to algorithms. For a terrifying example, consider the Israel-Hamas war. Amidst the harrowing bombardment of Gaza, reporting has revealed that the Israeli military has been using Lavender, a machine learning system, to assist its attacks on the Gaza Strip by identifying Hamas members or affiliates as targets. The system was conceived to resolve a so-called “human bottleneck for both locating the new targets and decision-making to approve the targets,” as described in a 2021 book by Brigadier General Y. S., confirmed to be the commander of an “elite Israeli intelligence unit” (Abraham). Along with authorization to cause collateral damage, with “the number of civilians they were allowed to kill alongside each target . . . [being] up to 20,” the protocol of the AI targeting system does not require soldiers “to independently check why the machine made that choice or to examine the raw intelligence data on which it [was] based” (Abraham). In a shocking display of algorithmic trust, the army has relied on a system with known flaws and only “90 percent accuracy in identifying an individual’s affiliation with Hamas”; in other words, roughly one in every ten people the machine marked could have had no affiliation at all. Human checks and verification, meanwhile, received less and less priority as the war progressed, often relegated to a few seconds per target and, in some cases, reduced to confirming that the target was male rather than female (Abraham). The Lavender system shows how an authoritative technology can provide cover for visibly brutal actions.
The companies and military authorities discussed above all operate under the umbrella of ostensibly democratic institutions. But as control and decision-making are transferred from humans to algorithms, we will become less inclined to examine the mechanisms by which our decisions are made: the demos in democracy will gradually shift in meaning and lose its exclusively human character. Knowing the stages of the epistemic coup, and how each one further forecloses avenues of resistance, means knowing that we have limited time to stem the tide of surveillance. If we squander it, Zuboff warns, it won’t be long before “we . . . hear banging on the Capitol doors once again.”
Works Cited
Abraham, Yuval. “‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza.” +972 Magazine, 3 Apr. 2024, www.972mag.com/lavender-ai-israeli-army-gaza.
“Attitudes towards the Internet in the United States 2022.” Statista, www.statista.com/forecasts/997172/attitudes-towards-the-internet-in-the-us.
Bogost, Ian. “The Cathedral of Computation.” The Atlantic, 15 Jan. 2015, www.theatlantic.com/technology/archive/2015/01/the-cathedral-of-computation/384300/.
“Paternalism.” Cambridge Dictionary, 8 Jan. 2020, dictionary.cambridge.org/us/dictionary/english/paternalism.
Cheney-Lippold, John. We Are Data: Algorithms and the Making of Our Digital Selves. New York University Press, 2017.
Clarke, Arthur C. Profiles of the Future: An Inquiry into the Limits of the Possible. Harper & Row, 1973.
Dwoskin, Elizabeth. “Come to the ‘War Cry Party’: How Social Media Helped Drive Mayhem in Brazil.” Washington Post, 9 Jan. 2023, www.washingtonpost.com/technology/2023/01/08/brazil-bolsanaro-twitter-facebook/.
Fujita, Akiko. “GPS Tracking Disaster: Japanese Tourists Drive Straight into the Pacific.” ABC News, 16 Mar. 2012, abcnews.go.com/blogs/headlines/2012/03/gps-tracking-disaster-japanese-tourists-drive-straight-into-the-pacific/.
Jia, Katherine M., et al. “Estimated Preventable COVID-19-Associated Deaths Due to Non-Vaccination in the United States.” European Journal of Epidemiology, vol. 38, 2023, pp. 1125–28, doi.org/10.1007/s10654-023-01006-3.
Mac, Ryan, et al. “Facebook Executive in 2016: ‘Maybe Someone Dies in a Terrorist Attack Coordinated on Our Tools.’” BuzzFeed News, 29 Mar. 2018, www.buzzfeednews.com/article/ryanmac/growth-at-any-cost-top-facebook-executive-defended-data#.upw3jdyR8.
Maranto, Lauren. “Who Benefits from China’s Cybersecurity Laws?” Center for Strategic & International Studies, 25 June 2020, www.csis.org/blogs/new-perspectives-asia/who-benefits-chinas-cybersecurity-laws.
Martínez, A., and Allison Aubrey. “How Vaccine Misinformation Made the COVID-19 Death Toll Worse.” NPR, 16 May 2022, www.npr.org/2022/05/16/1099070400/how-vaccine-misinformation-made-the-covid-19-death-toll-worse.
Matyszczyk, Chris. “GPS Sends Belgian Woman to Croatia, 810 Miles out of Her Way.” CNET, 14 Jan. 2013, www.cnet.com/culture/gps-sends-belgian-woman-to-croatia-810-miles-out-of-her-way/.
NVIDIA. “NVIDIA: World Leader in Artificial Intelligence Computing.” NVIDIA, 2023, www.nvidia.com/en-us/.
Petrosyan, Ani. “Countries with the Largest Digital Populations in the World as of January 2023.” Statista, 5 Apr. 2023, www.statista.com/statistics/262966/number-of-internet-users-in-selected-countries/.
South, Todd. “Army Approves Next Phase for Augmented Reality Device.” Army Times, 7 Sept. 2023, www.armytimes.com/news/your-army/2023/09/07/army-approves-next-phase-for-augmented-reality-device/.
Zuboff, Shoshana. “The Coup We Are Not Talking About.” The New York Times, 29 Jan. 2021, www.nytimes.com/2021/01/29/opinion/sunday/facebook-surveillance-society-technology.html.