WIN NYC x R/GA: The Role of Bias & Ethics in AI

Humans are inherently biased. Whether our biases stem from a limited point of view, unintended exclusion of certain considerations, or simply the tendency to see what we want to see rather than the whole picture, bias is an unavoidable feature of life. Since AI is developed by humans, our own biases can be reflected in how we develop AI tools, including the inputs that train algorithmic systems. In the case of AI, bias may not always appear as obvious prejudice and can go undetected. So what happens when we unknowingly produce machine bias in systems that are already impacting many aspects of modern life?

As AI becomes increasingly a part of our lives and our work, it’s more important than ever that we understand how to spot bias and fix it, especially as women, since we are often the subjects of this bias.
— R/GA Team

WIN NYC partnered with R/GA to discuss the implications of bias in AI. Jenna Niven, R/GA Group Creative Director and AI expert, along with Christine Creamer, Associate Director of Business Transformation and a lawyer by background, highlighted the importance of detecting bias in AI and how to address it through ethics. The audience received a rundown of AI and ethics and had the chance to put it into practice in breakout sessions.

Here are three key takeaways:

1. AI is only as good as its inputs

Jenna Niven compared building an AI model to baking a cake. If there is something wrong with your ingredients (the data you select), then no matter how well you follow the recipe (the rules of the algorithm), your cake (the output) won’t be great.

Illustration credit: Nadia Piet

2. Bias can go beyond human bias

Bias in Data Collection, the ingredients

  • Sample Bias/Statistical Bias: choosing data that isn’t large enough or representative enough of the population of interest (a quick check for this is sketched at the end of this section).

  • Measurement Bias: value distortion that occurs when the tool used to measure or observe the data skews it.

  • Observer Bias: tendency to see what we expect to see in the data.

Bias in AI Design, the recipe

  • Prejudice Bias & Implicit Bias: cultural judgements on gender, race, age, social class baked into the algorithm.

  • Exclusion Bias: excluding features from a data set, usually in the course of cleaning the data (this can be intentional).
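To make the sample-bias idea concrete, here is a minimal sketch (in Python, with entirely made-up numbers) of how a team might compare the make-up of a training set against the population a product is meant to serve. The group labels, shares, and the 15-point gap threshold are illustrative assumptions, not a method recommended at the event.

```python
from collections import Counter

# Hypothetical age-group labels for each record in a training set.
training_ages = ["65+", "65+", "65+", "45-64", "65+", "65+", "45-64", "65+"]

# Rough share of each age group in the population the product is meant to serve.
# These numbers are made up for illustration only.
target_population = {"18-44": 0.45, "45-64": 0.35, "65+": 0.20}

counts = Counter(training_ages)
total = len(training_ages)

print(f"{'group':<8}{'in data':>10}{'target':>10}")
for group, target_share in target_population.items():
    data_share = counts.get(group, 0) / total
    flag = "  <-- possible sample bias" if abs(data_share - target_share) > 0.15 else ""
    print(f"{group:<8}{data_share:>10.0%}{target_share:>10.0%}{flag}")
```

A real audit would look at many more attributes and use proper statistical tests, but even a rough comparison like this can surface the kind of gaps the speakers described.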

3. Ethics should play a role in every step of the AI development process

  • Ideation: What is the impact of creating this tool/algorithm? What is the potential risk to society and individuals? How much human intervention is recommended?

  • Data design: What is the quality of the data? What is the source of the data set, and is it reputable? Is the data set diverse, and is there a risk of historical bias in it? Is the sample large enough to represent a diverse set of variables? Has the data set been repaired? What needs to be done to guard against technical bias?

  • Testing: Which industries, services, or social rituals might be disrupted by this? What consequences might this have on culture, politics, economics, and so on? What’s the best and worst case scenario? Who benefits, and who suffers? (One way to check error rates across groups at this stage is sketched below.)
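As one hypothetical way to act on the testing questions above, the sketch below (again in Python, with invented results) computes false negative and false positive rates separately for two groups, so a gap between them is visible before release. The groups, labels, and numbers are assumptions for illustration only.

```python
# Hypothetical test results: (group, true_label, predicted_label),
# where 1 means "condition present". All values are invented for illustration.
results = [
    ("65+",      1, 1), ("65+",      0, 0), ("65+",      1, 1), ("65+",      0, 1),
    ("under 65", 1, 0), ("under 65", 0, 0), ("under 65", 1, 0), ("under 65", 0, 0),
]

def error_rates(rows):
    """Return (false negative rate, false positive rate) for a list of rows."""
    positives = [r for r in rows if r[1] == 1]   # condition actually present
    negatives = [r for r in rows if r[1] == 0]   # condition actually absent
    fn_rate = sum(1 for _, y, p in positives if p == 0) / max(len(positives), 1)
    fp_rate = sum(1 for _, y, p in negatives if p == 1) / max(len(negatives), 1)
    return fn_rate, fp_rate

for group in ("65+", "under 65"):
    fn, fp = error_rates([r for r in results if r[0] == group])
    print(f"{group}: false negative rate {fn:.0%}, false positive rate {fp:.0%}")
```

In this invented example the model misses every true case in the under-65 group, which is exactly the kind of gap the scenario below turns on.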


Here’s a look at one of the scenarios discussed: an app that uses AI to detect irregular heartbeats, with training data sourced from a population aged over 65.



Can you spot the ethical implications?

This scenario could cause real harm to people.

  • A false negative can put lives in danger, while a false positive can put a strain on the healthcare system.

  • The data was sourced from a population aged over 65. As a result, there is sample bias: the data doesn’t accurately represent the people who may actually use the app.

  • One way to address these issues could be to bring in a cardiologist to work in conjunction with the AI, as sketched below. For example, the app could alert the user to an irregular heartbeat and leave the diagnosis to the cardiologist.
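Here is a minimal sketch of what that human-in-the-loop idea could look like in code, assuming some upstream model produces a confidence score. The function name and the 0.6 threshold are hypothetical and would need clinical validation.

```python
def handle_reading(model_score: float, alert_threshold: float = 0.6) -> str:
    """Decide what the app does with one heart-rhythm reading.

    model_score is an assumed confidence value between 0 and 1 from an
    upstream model; the threshold is arbitrary and for illustration only.
    """
    if model_score >= alert_threshold:
        # The app never diagnoses on its own: it alerts the user and
        # routes the reading to a cardiologist, who makes the actual call.
        return "Irregularity flagged: sent to a cardiologist for review"
    return "No alert: reading stored for routine monitoring"

for score in (0.2, 0.75, 0.95):
    print(f"score={score:.2f} -> {handle_reading(score)}")
```

The point of the design is that the AI narrows attention and speeds up triage, while the diagnosis, and its consequences, stay with a human expert.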



Thank you to WIN Ambassador Stef Hoffman and the R/GA team for hosting an impactful event, and to our WIN NYC community for their input and thoughtful discussion. See you at the next one!

Interested in learning more? Here’s a reading list from the R/GA team:


Written by Marcela Madera

Photos taken by Annie Chen

WIN: Women in Innovation Copyright © 2018. All rights reserved. This content may not be reproduced or repurposed without written permission from WIN: Women in Innovation, a 501(c)(3). This content is provided for your personal use only.