Confronting Bias

Additional Reading

Search Engine and Search Algorithm Bias

"People think of algorithms as simply a mathematical formulation but in fact algorithms are really about automated decisions"

Safiya Umoja Noble

Algorithms of Oppression: Safiya Umoja Noble

"Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices."

Joy Buolamwini

How I'm Fighting Bias in Algorithms: Joy Buolamwini

"So I'm asking you to remember that behind every algorithm is always a person, a person with a set of personal beliefs that no code can ever completely eradicate."

Andreas Ekström

The Moral Bias Behind Your Search Results: Andreas Ekström

Library Discovery Tool Bias

Library search tools and databases are not free from bias. Many library tools rely on controlled vocabularies such as the Library of Congress Subject Headings (LCSH) or the National Library of Medicine's Medical Subject Headings (MeSH). A controlled vocabulary is a list of approved terms that are assigned to the articles and other resources contained in a database. Because these vocabularies are created by people, they are susceptible to human biases and prejudices. Although controlled vocabularies are continually reviewed and updated, library search tools have a long history of racist and prejudiced subject headings, and a biased term remains in use until librarians, library users, or the general public contest it and the vocabulary is revised.
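The mechanism described above can be sketched in a few lines of code. This is a minimal, hypothetical model (not any real catalog system's API): records may only be tagged with terms from the approved list, so a contested heading stays attached to every record until the vocabulary itself is revised. The example terms reference the real 2021 LCSH revision that replaced "Illegal aliens" with "Noncitizens".

```python
# Hypothetical sketch of a controlled vocabulary: an approved term list
# that constrains which subject headings can be assigned to records.
APPROVED_HEADINGS = {"Illegal aliens", "Algorithms"}

catalog = {}  # record id -> set of assigned subject headings

def assign_heading(record_id, heading):
    """Only terms in the approved list can be attached to a record."""
    if heading not in APPROVED_HEADINGS:
        raise ValueError(f"{heading!r} is not in the controlled vocabulary")
    catalog.setdefault(record_id, set()).add(heading)

def revise_heading(old, new):
    """Revising the vocabulary updates every record that used the old term."""
    APPROVED_HEADINGS.discard(old)
    APPROVED_HEADINGS.add(new)
    for headings in catalog.values():
        if old in headings:
            headings.discard(old)
            headings.add(new)

assign_heading("rec1", "Illegal aliens")
# Until revise_heading is called, the contested term remains in use
# on every record that carries it.
revise_heading("Illegal aliens", "Noncitizens")
```

After `revise_heading`, record `rec1` carries "Noncitizens" instead of the contested term, which mirrors how a subject-heading revision propagates through a catalog.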

Library of Congress Subject Headings (LCSH)