Preamble

This essay was written for a course in the philosophy of technology and seemed worth posting / preserving. The essay prompt was something along the lines of: “Is Zuckerman’s ‘Digital Public Infrastructure’ (DPI) framework a good way to address the representational harms raised in Safiya Noble’s book, Algorithms of Oppression?”

While I am somewhat critical of DPI here, I want to be clear that I am overall sympathetic to DPI – and more broadly, democratically-informed technological design – as a direction for addressing at least some (maybe many?) of the internet’s problems. I think that some of the critiques I am raising here are actually deeper issues related to minority rights and protections under democracy (which also apply to the DPI vision). In short, to “democratize” something is not per se to ensure that it is just or fair or aligned with any particular vision of the good. In general I would like to better understand themes from democratic theory on this topic – e.g. “tyranny of the majority” and the problem of “persistent minorities”. For now, on to the essay.

Representational Harms & Digital Public Infrastructure

In Algorithms of Oppression, Safiya Umoja Noble documents how Google search produces representational harms to various identity groups, including Black women. Noble argues that these harms arise because Google is a commercial actor embedded in a system of racial capitalism, rather than an “objective” content curator. To mitigate these harms, Noble argues that we must disentangle web search from narrow commercial ends by (1) providing regulatory protections for harmed groups, or (2) developing publicly-funded noncommercial web search. In “What is Digital Public Infrastructure?”, Ethan Zuckerman extends Noble’s argument for publicly-funded web search, embedding it in a broader policy vision for publicly-funded “digital public infrastructure” (DPI) developed with “civic values” in mind. In this essay, I will argue that Zuckerman’s DPI vision is promising but insufficient to address Noble’s harms, for two reasons. First, Zuckerman’s articulation of DPI backgrounds racial justice concerns, opening the possibility of democratically-minded digital infrastructure that fails to correct representational harms. Second, I argue that DPI alone – racial-justice-oriented or otherwise – will not mitigate Noble’s harms unless paired with broader regulation or reform to check or restructure market governance of web search. Addressing these issues yields a vision of DPI that is more compelling for addressing Noble’s harms.

In Algorithms of Oppression, Noble shows how Google search has facilitated representational harms to multiple groups, most notably Black women. I define representational harms as harms that “represent some group in a way that reinforces harmful stereotypes or misconceptions” [1]. In the case of Google, Noble shows that (as of 2011) a search for the phrase “black girls” yielded predominantly sexual, pornographic results. Such results reproduce and reinforce hypersexualized and commodified misrepresentations of Black women which are also present in traditional media. Noble notes that such dehumanizing misrepresentations ultimately play a part in maintaining broader systems of racial and gender subjugation [2].

For Noble, representational harms from search arise fundamentally because Google is a private commercial enterprise, embedded in a system of racial capitalism wherein racism and sexism are profitable [3]. Even if Google’s ranking and advertising systems do not advance racism and sexism directly, they create a curation structure wherein well-financed commercial interests (such as the pornography industry) can advance harmful racial/sexual representations via investment in search engine optimization, or by bidding on advertising space [4]. Likewise, Google itself has weak commercial incentives to police and correct problematic misrepresentations of marginalized groups that have limited economic and political power to contest its rankings.

Having shown that representational harms arise from Google’s commercial status, Noble centers her normative proposals on disentangling web search from narrow commercial ends. Noble outlines two main classes of policy solutions: (1) the provide protections approach [5] and (2) the public provision approach. The provide protections approach recommends creating new mechanisms to ensure that both individuals and groups of people (e.g. Black women) have the ability to legally contest the contents of commercial search results [6], either directly or with the help of a regulatory agency like the FTC. This policy vision leaves the market logics of web search in place, but provides protections and recourse for harmed groups. By contrast, the public provision approach involves building publicly-funded noncommercial search alternatives to Google, thus achieving a more radical decoupling of web search and commercial interests. Publicly-funded search could reimagine the basic design logics of web search [7], and in general could ensure that search results are accurately rooted in contextual and historical understandings of race and gender, thus avoiding the representational harms outlined above.

In “What is Digital Public Infrastructure?”, Ethan Zuckerman extends Noble’s argument for publicly-funded web search, embedding it in a broader policy vision for publicly-funded “digital public infrastructure” developed with “civic values” in mind. For Zuckerman, DPI refers to the digital systems necessary for us to engage in civic life online; this includes physical internet infrastructure as well as application-level systems like social media networks (e.g. Facebook, Twitter, YouTube) and discovery systems (e.g. Google). Like Noble, Zuckerman believes that our current digital infrastructures have given rise to a range of civic and social harms – representational harms on Google search are one example; disinformation on Facebook is another. Further, Zuckerman agrees that these harms stem from the fact that our current infrastructure has been built primarily by for-profit corporations under the logic of “surveillance capitalism” [8]. Zuckerman’s policy vision extends but refashions Noble’s public provision approach, emphasizing that we should develop publicly-funded search systems that reflect our “civic values” and will “have a productive role in democratic societies” [9], noting that transparency and auditability of algorithms and results are especially important for search.

Digital public infrastructure is a promising approach for addressing Noble’s harms because its focus on public funding and “civic values” disentangles search from narrow commercial interests; however, I will now outline two reasons why DPI alone is insufficient to address Noble’s harms. First, Zuckerman’s articulation of DPI insufficiently foregrounds racial justice concerns. Where Noble’s policy solutions address specific race- and gender-based harms, Zuckerman writes more broadly of aligning DPI with our “civic values”. While this generalization is helpful in bringing multiple digital policy issues under a single umbrella, the cost is specificity about what problems need to be solved and how. Zuckerman would likely argue that aligning web search with our “civic values” entails ensuring that it is free from discrimination and representational harm [10]; however, his proposal deliberately abstracts away from articulating this. As American history shows, it is unfortunately very possible for democratic processes and publicly-funded infrastructure to nonetheless advance identity-based harms. For example, consider the publicly-funded but racially-discriminatory parkway overpasses famously discussed by Winner (2017), which were allegedly built low enough to prevent buses – and thus poorer, disproportionately Black residents – from reaching public beaches. In short, liberating web search from the constraints of racial capitalism through public funding may indeed create the possibility of mitigating discriminatory outcomes, but it does not by itself guarantee that mitigation. Our visions for DPI should explicitly articulate how DPI will ensure racial and gender-based justice.

Second, producing DPI alone – oriented around racial justice or not – is unlikely to mitigate Noble’s harms unless paired with structural reform or regulation to check or restructure market governance of search. Producing a publicly-funded Google alternative that is free of racial bias does not yet address the problem of Google itself continuing to advance these kinds of harms. This is especially problematic if Google retains a dominant market position relative to any public alternative (an outcome that is likely, due to structural economic forces like network effects [11]). As such, DPI should be paired either with (1) additional regulation to mitigate existing harms from incumbents like Google, or (2) broader structural reform to the market incentives that produced the harms in the first place. Noble’s provide protections approach is an example of (1). Examples of (2) could include taxes on surveillance advertising (insofar as this is a key driver of harms), or forms of antitrust action that might help public search alternatives gain a foothold in the market.

Ultimately, DPI is a promising policy direction for addressing Noble’s harms (and others like them), as it successfully disentangles search from narrow commercial interests. However, DPI is insufficient on its own. To ensure that DPI delivers on its promises, it should be coupled with (1) an explicit commitment to racial and gender equity, and (2) additional reform or regulation to mitigate harms from incumbents like Google.

References

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.

Winner, L. (2017). Do artifacts have politics? In Computer Ethics (pp. 177–192). Routledge.

Zuckerman, E. (2020). The case for digital public infrastructure.

Zuckerman, E. (2020). What is digital public infrastructure? Center for Journalism & Liberty. Available at: https://static1.squarespace.com/static/5efcb64b1cf16e4c487b2f61

Footnotes

  1. This definition is from Crawford (2017), via Wade (2021).

  2. For example, Noble writes: “Black feminist scholars have already articulated the harm of such media misrepresentations: gender, class, power, sexuality, and other socially constructed categories interact with one another in a matrix of social relations that create conditions of inequality or oppression” (33). 

  3. For example, Noble writes explicitly: “I intend to meaningfully articulate the ways that commercialization is the source of power that drives consumption of Black women’s and girls’ representative identity on the web” (33) — i.e. citing commercialization as the key driver of the representational harms she outlines. 

  4. For example, Noble writes: “Search results reflect the values and norms of the search company’s commercial partners and advertisers and often reflect our most demeaning beliefs, because these ideas circulate so freely and so often that they are normalized and extremely profitable” (35).

  5. Outlined mainly in chapters 4 and 6 of Algorithms of Oppression.

  6. For example, Noble writes: “Ultimately, both individuals and communities are not sufficiently protected in Google’s products and need the attention of legislators in the United States” (123). 

  7. See e.g. pg. 180 of Algorithms of Oppression.

  8. See e.g. page 8 of “What is Digital Public Infrastructure?”. I note that the term “surveillance capitalism” rose to popularity only after the publication of Noble’s book, but is highly consonant with her analysis. 

  9. “The Case for Digital Public Infrastructure”, pg. 6. 

  10. Indeed, I note that Zuckerman mentions discrimination in passing on pg. 12 of “What is Digital Public Infrastructure?”

  11. In short: for the same reasons that it is difficult for private sector competitors to challenge Google on the basis of consumer choice, it will be difficult for any public sector competitor as well.