4 Ways to Address Gender Bias in AI
Any examination of bias in AI must recognize that these biases stem mainly from humans' own biases. The models and systems we create and train are a reflection of ourselves. So it is no surprise that AI learns gender bias from humans. For instance, natural language processing (NLP), a critical ingredient of common AI systems like Amazon's Alexa and Apple's Siri, has been found to exhibit gender bias – and this is not an isolated incident. There have been several high-profile cases of gender bias, including computer vision systems for gender recognition that reported higher error rates for women, particularly those with darker skin tones.

To produce fairer technology, researchers and machine learning teams across the industry must make a concerted effort to correct this imbalance. We have an obligation to create technology that is effective and fair for everyone.