I was rereading Algorithms of Oppression by Safiya Noble, and even given the author's own warning (“…a book written about algorithms or Google in the twenty-first century is out of date immediately upon printing”), I find myself amazed by the book's foresight and clarity. This excerpt is from the section “The Power of Algorithms.”
Part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings. While we often think of terms such as “big data” and “algorithms” as being benign, neutral, or objective, they are anything but. The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors.
These human and machine errors are not without consequence, and there are several cases that demonstrate how racism and sexism are part of the architecture and language of technology, an issue that needs attention and remediation. In many ways, these cases that I present are specific to the lives and experiences of Black women and girls, people largely understudied by scholars, who remain ever precarious, despite our living in the age of Oprah and Beyoncé in Shondaland. The implications of such marginalization are profound. The insights about sexist or racist biases that I convey here are important because information organizations, from libraries to schools and universities to governmental agencies, are increasingly reliant on or being displaced by a variety of web-based “tools” as if there are no political, social, or economic consequences of doing so. We need to imagine new possibilities in the area of information access and knowledge generation, particularly as headlines about “racist algorithms” continue to surface in the media with limited discussion and analysis beyond the superficial.
That was 2018! You could swap out the word “algorithm” for almost any AI-related concept today, and this book would still hit the mark. This is not just a Google problem; in many ways, every major LLM is careening into the same set of problems. It's as if we're plagued by intentional and collective amnesia: our failure to embrace systemic thinking leads us to repeat the same errors every few years, and we ask our most vulnerable citizens to bear the consequences.