Module 2: Real-world Examples of Unethical AI
Automating Inequality
Continuing the theme of exploring the ways in which AI can go wrong, we will read Virginia Eubanks's book Automating Inequality. The book examines how automation and AI tools are used to monitor, profile, and penalize people with lower socioeconomic status.
As with the last book, everyone will read a common part of the book, and then different subgroups will present and discuss the remaining chapters. Your discussion groups can sign up for the different sections in class the week before.
Assignment 1: Automating Inequality
- Everyone will read the introduction and complete the reading declaration
Assignment 2: Automating Inequality
- With everyone having read the introduction, each group needs to sign up for two chapters from the list below. Signups are on Canvas so that we can make sure at least two groups sign up for every chapter (every chapter needs one presenting group and at least one discussion group)
- Chapter 1: From Poorhouse to Database
- Chapter 2: Automating Eligibility in the Heartland
- Chapter 3: High-Tech Homelessness in the City of Angels
- Chapter 4: The Allegheny Algorithm
- Chapter 5: The Digital Poorhouse
- When you sign up for a chapter, you need to be ready to summarize it for the class (turn in a few short summary slides on Canvas), come prepared with discussion questions, and bring any relevant case studies from the news or your own experience.
