The Importance of Decoding Unconscious Bias in AI

Despite its widespread adoption, Artificial Intelligence still has a long way to go in terms of diversity and inclusion.


We try to write a lot about the positive ways in which Artificial Intelligence and other technologies are impacting our world. It’s a subject close to our hearts as a company, and quite frankly, something that should be celebrated and shouted about given all the doom and gloom we’re so often bombarded with in today’s media.


From healthcare and sustainable cities to climate change and industry, investment in AI is making an impact in many areas. Applications of machine learning and deep learning help shape the trajectories of our daily lives, so much so that we are barely even aware of it.


However, all of this do-gooding aside, one of the biggest obstacles in AI programming is the inherent bias that exists within it. This progressive technology can at times lead us right back to the uneducated, discriminatory ideologies that are better left in the past: a place where race and criminality go hand in hand, where women are associated with kitchens and cooking, and men with the boardroom.


There are some cringe-worthy examples out there, such as the time Nikon’s Coolpix kept asking an Asian family ‘Did someone blink?’ when they took pictures. The message only stopped appearing once they took a picture with a relative holding their eyes open wide. Yes, that really happened.


And it doesn’t get much better. How about racial profiling in risk assessment modelling for criminal sentencing? ProPublica found that racially biased AI predicts a greater likelihood of reoffending for minorities because of their ethnic background.


And it’s not just limited to race. In the UK, a doctor was locked out of the women’s changing rooms at a Pure Gym because its computer system classified her job title as male only. Or how about Googling CEO and finding only pictures of white men, because minorities and women cannot be CEOs, right?


This list could go ON and ON. But you get the idea, so I’ll refrain.


For a long time, algorithms have been imbued with these racial, gender, and other stereotypical biases, and what is extremely troubling is that this technology is shaping so many aspects of our daily lives.


The brilliant Joy Buolamwini dubs algorithmic bias ‘the coded gaze’ in her 2016 TED Talk (if you haven’t seen this, you really need to!). She discusses how her journey into tackling racial discrimination in AI began when a computer system only recognised her face when she wore a white mask.


This, along with the other examples, comes down to the fundamental problem of the limited data that the algorithms are trained on. When your data is limited and not a true representation of society (for example, not racially or gender diverse), you end up omitting sections of our society from this technology.
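To make that a little more concrete, here is a minimal, purely illustrative sketch (the records, group names, and predictions are all hypothetical) of the kind of sanity check a team might run: comparing how much of the training data each demographic group contributes, and how the model’s error rate differs between groups.

```python
from collections import Counter

# Hypothetical training records: each one notes the demographic group
# of the person it describes and the true label (values are illustrative only).
training_data = [
    {"group": "group_a", "label": 1},
    {"group": "group_a", "label": 0},
    {"group": "group_a", "label": 1},
    {"group": "group_b", "label": 0},
    # ... in a real dataset, thousands more records
]

# 1. Representation check: how many examples does each group contribute?
counts = Counter(record["group"] for record in training_data)
total = sum(counts.values())
for group, n in counts.items():
    print(f"{group}: {n} examples ({n / total:.0%} of the data)")

# 2. Per-group error check, assuming we already have model predictions
#    (these are made up for the sake of the example).
predictions = [1, 0, 1, 1]
errors = Counter()
for record, prediction in zip(training_data, predictions):
    if prediction != record["label"]:
        errors[record["group"]] += 1
for group in counts:
    print(f"{group}: error rate {errors[group] / counts[group]:.0%}")
```

A skew in either number, far fewer examples for one group or a much higher error rate for it, is exactly the kind of imbalance behind the examples above.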


Stephen Hawking is no stranger to voicing his concerns on AI, having recently been quoted: “Success in creating AI could be the biggest event in the history of our civilization… But it could also be the last unless we learn how to avoid the risks. Alongside the benefits, AI will also bring dangers, like powerful autonomous weapons, or new ways for the few to oppress the many.” Oppression through algorithmic bias is a very real threat, and we need louder voices from sections of society with less power than the predominantly white, male circles of Silicon Valley.


There is light at the end of the tunnel, with many bright minds in the field of AI, like Joy, working to tackle these problems.


The non-profit American Civil Liberties Union (ACLU) has partnered with the research institute AI Now to tackle AI bias by addressing the very core of these algorithms. Researching how the AI is programmed is crucial to their work in a bid to produce algorithms free from bias. The ACLU works to protect the rights of individuals in the areas most affected by AI discrimination: housing, credit and lending, and prosecution and the criminal justice system.


The shocking revelations of how unfair machine learning bias can be in criminal sentencing, outlined in ProPublica’s report (linked above), show just how much regulation and investigation of the algorithms behind this discrimination is needed. Hopefully, the partnership between the ACLU and AI Now will go some way towards tackling these issues.


Similarly, the movement ‘Data For Black Lives’ is working to address issues of inequality and discrimination within systems such as financial and credit services, predictive policing, and risk assessment in sentencing. They aim to use “data science to create concrete and measurable change in the lives of Black people.” You can read more about the work they are doing on their website.


Microsoft’s research group FATE (Fairness, Accountability, Transparency, and Ethics in AI) is working on the social implications of AI, carrying out research into artificial intelligence and machine learning to produce algorithms that are ethical. One member of this team, researcher Timnit Gebru, co-founded the ‘Black in AI’ event, which took place at NIPS 2017. Part of tackling this issue is bringing it into the foreground and encouraging diversity within the field of AI itself.


We’re also seeing popular culture address these issues and use its platform to inspire a new generation of brilliant minds. The recent box office hit Black Panther, from the Marvel Universe, shows Princess Shuri, a young Black woman, as central to the development of advanced technology for the most advanced nation on Earth, Wakanda. This portrayal (by the very talented Letitia Wright, may I add) could inspire more women, especially women of colour, to enter STEM fields.


The very advancement of AI and its ability to do good in the world go hand in hand with how it perceives and profiles people from different walks of life. If we continue to build discriminatory AI systems that systematically oppress certain communities within our society, then the technology will never truly achieve social good.


Charlotte McKee, Content and Social Media Specialist



Looking for a job in Artificial Intelligence?
