Module 2: Real-world Examples of Unethical AI
Coded Bias
Coded bias refers to the way societal biases can be unintentionally encoded in the data used to build AI systems, which the AI then learns to reproduce. We will discuss this concept in class through two sets of examples: the first comes from a movie of the same name and the second from a book discussing similar ideas.
Assignment 1: Coded Bias Movie
For this first assignment on learning about real-world examples of unethical AI, I want you to watch a movie. We will mostly read books and papers this semester, but this movie is a good introduction to why we need ethical and responsible AI.
OU has purchased the movie for any OU member (anyone with a valid login to OU Libraries) to watch. The link is below, but it will require you to sign in to watch the movie (after you sign in, it will take you to a MyMedia link). Note: this movie is also available on Netflix and other platforms, and you are welcome to watch it on one of those platforms as well. The link below is available for free, but the movie is the same! Please make this a social opportunity and watch the movie with your friends! Come prepared to discuss the movie in class.
- Watch the Coded bias movie: https://ou-primo.hosted.exlibrisgroup.com/permalink/f/bqqc6e/NORMANLAW_ALMA51643791110002042
- At least 2 groups must sign up to present a summary of the movie.
- All other groups must submit at least 2 discussion questions about the movie (not taken directly from the available discussion guide!)
- Everyone (by group) should bring examples of how you have seen coded bias in your everyday life.
Assignment 2: Race After Technology
Ruha Benjamin's book Race After Technology is a fascinating look into how technology itself (and not just AI) can suffer from coded bias. Because the book is long, we are going to break it up and present parts of it to each other (though you are of course encouraged to finish the full book yourself!)
- Everyone will read the Introduction.
- Groups will sign up for sections 1, 2, 3, and 4. We will do this on day 1 of class when you form initial groups. Everyone will need to read their chosen section and be ready to listen, learn, and discuss the others.
- Section 1: Engineered Inequality
- Section 2: Default Discrimination
- Section 3: Coded Exposure
- Section 4: Technological Benevolence
- Every group should bring examples from the news or personal experience of the issues discussed in the section they chose.
