
Algorithmic bias in search engines, on social media, and in online databases directly shapes how we find and interact with knowledge online. It can silence and erase diverse voices through practices such as shadowbanning or the creation of filter bubbles and echo chambers, and it can be used as a tool for social control through censorship and propaganda. At the same time, online spaces can be incredibly valuable in allowing people whose voices have been silenced to share their lived experiences. Finding and listening to these voices can help us in many contexts, from community-building and organizing, to personal learning and unlearning, to making evidence-based decisions in our professions. These facts exist in tension with one another, so what can we do?
This session will introduce participants to the practice of knowledge justice, which asks us to hold multiple truths, and multiple voices, in balance with each other. Through this lens, we can recognize and address the harms presented by algorithmic bias while engaging with knowledge online to seek out diverse voices in a thoughtful and responsible way.
Session duration: 90 minutes
Please click on the facilitators’ names in the session info to view their bios.
This session will be recorded and shared on the website and on our YouTube channel.