Algorithmic Architects

My book project, based on my dissertation, examines the role of algorithms in contemporary technological systems. I am interested in understanding how the structures of digital media shape rhetorical events and agency in the information age.

My project seeks to understand how algorithms shape democratic engagement. In short, I argue that algorithmic power comes from arrangement, which I take to be the prevailing rhetorical canon in the Attention Economy (Lanham) of networked media. Drawing on ancient Greek rhetorical theory and philosophy, I argue that the term kosmos best captures how socio-material order circulates in networked systems. Combining kosmos with posthuman views of agency, I analyze three case studies in algorithmic power and democracy:

  1. "Google Bombing in the Information Wars": This chapter studies how Google search results become sites of political contestation through Google bombing. I examine the Google bombing of Rick Santorum, a campaign designed to defame the former Senator by attaching a sexually explicit alternative definition to his surname. I also analyze the alt-right's use of Google bombing to spread white nationalism, including anti-Semitism.
  2. "YouTube's Algorithmic Agora": This chapter analyzes how YouTube's algorithms determine who may participate on the platform. I first focus on copyright algorithms, which in 2015 blocked Rand Paul's presidential campaign announcement video after it matched copyrighted audio. I then investigate how YouTube's moderation systems falsely flagged LGBTQ+ videos, demonetized them, and hid them from young viewers in Restricted Mode.
  3. "Personalized Publics and Racializing Algorithms": This chapter examines how Facebook infers race and harms minority users. I first trace Facebook's deployment of "ethnic affinity" advertising, which allowed the site to serve different advertisements for Straight Outta Compton to different audiences, and which came under fire for allowing advertisers to illegally exclude minority users from viewing housing ads. I then assess Facebook's algorithmic moderation system and its tendency to protect white users while restricting black voices on the platform.

In each case, algorithms create a world that reflects the human biases of the "offline" world. Algorithms are not inherently objective, unbiased, fair, or just. Rather, we must recognize their tendencies to harm minority users, reify privilege, and prioritize profit. I offer in the conclusion a few strategies for creating better algorithms, starting with a simple notion: to improve our algorithms, we must improve ourselves.

Networked Realities

An offshoot of my book project explores the relationships among algorithms, social platforms, and the “objectivity” of networked reality. This work has resulted in two publications thus far:

  • “Networked Reality and Technological Power: Argumentation and Memory in Facebook Memorials for Nelson Mandela,” published in Argumentation & Advocacy, and
  • “Timescape 9/11: Networked Memories,” which will appear in the forthcoming volume Networking Argument (ed. Carol Winkler).

Both of these essays work from the premise that “real time” suggests authenticity and truth in networked environments—a premise that I adapt from Wendy Chun’s observations in Updating to Remain the Same. I contend that algorithms are the ultimate arbiters of networked reality, as they both observe the world “as it is” and create a new world in “real time.” What is offered by the algorithm is most frequently taken as truth, catalyzing what I describe as the “argumentative force of networked reality.” In both cases, the algorithm architects data-based memories, obfuscating its subjectivity by claiming to map only what is real.