Is Machine Learning Sexist?

We’ve been hearing a lot about diversity and inclusion recently, and one area that many people are particularly excited about is applying machine learning to eliminate bias. Done right, this can be a huge boost to efforts to move beyond bias across all areas of the organization. But there are pitfalls as well: done wrong, machine learning can actually make your business more biased.

Take Google’s word2vec, for example. Using a corpus of millions of Google News articles, Google researchers extracted patterns of words that are related to each other. By representing the terms in a vector space, they were able to deduce relationships between words with simple vector algebra. For instance, the system can correctly answer questions such as “sister is to woman as brother is to what?” (sister:woman :: brother:?) with “man.” But therein lies the challenge: because the system is trained on existing news articles, it also absorbs whatever bias those articles carry. And the Google News articles proved to be shockingly biased; the same arithmetic that completes sister:woman :: brother:man also completes man:computer programmer :: woman:homemaker.
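The analogy arithmetic is easy to try yourself. Below is a minimal sketch using the gensim library and the pretrained Google News vectors; the file name, the `topn` parameter, and the exact output are assumptions for illustration, not details from the original article.

```python
from gensim.models import KeyedVectors

# Load the pretrained 300-dimensional word2vec vectors trained on the
# Google News corpus (file name assumed; download it separately).
model = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# sister : woman :: brother : ?
# Computed as vector("woman") - vector("sister") + vector("brother"),
# then finding the nearest word to the result.
result = model.most_similar(
    positive=["woman", "brother"], negative=["sister"], topn=1
)
print(result)  # expected to rank "man" first
```

The same `most_similar` call, given different word pairs, is what surfaces the biased completions: the model has no notion of which associations are factual and which are stereotypes in its training data.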

