It is a widely known fact within the IT industry that there is not much diversity among the people who build the internet. Why is this a problem, and why should we address it?

Technology is everywhere. We use technology to communicate with our peers at work, with our families, with our friends. We use technology to search for information, we use it to get our news, we use it to learn and to grow. Given how omnipresent technology is in everyone's lives, it is imperative that it be built for everyone. Moreover, it's important that technology is built by everyone.

While it is true that most people are naturally empathetic, there's only so far our empathy can go. To give a silly yet relatable example, last weekend we forgot to purchase vegan sweets for an event. The reason: it had always been our (then absent) vegan friend who thought about those things. One can make an extra effort to be empathetic and walk in someone else's shoes, but without sharing their reality we can only do so to an extent.

Of course, the lack of empathy when building a product can go beyond sensitivities and affect functionality as well. A clear example of this is facial recognition algorithms. Take the case of Joy Buolamwini, an African-American MIT student, whose face was not being consistently recognised by the face-detection algorithms she was using to complete her studies. In order to test her assignments, she even had to resort to wearing a white mask to increase contrast in low-light environments and have her face detected.

Does this mean that whoever created the face detection algorithms is racist, or that the algorithm has a deliberately racist bias? Not at all. Most face detection programs use artificial intelligence: a neural network is trained on a set of samples (in this case, faces), which allows it to learn patterns to match against. The main cause of Black faces not being recognised, or Asian eyes being detected as closed, is that the set of samples used to train the neural network was not diverse enough.
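To make the mechanism concrete, here is a deliberately over-simplified sketch. It is not how any real face-detection system works: the "detector", its training rule, and every number are invented for illustration. It only shows the principle that a model calibrated on an unrepresentative sample set can fail on inputs it never saw during training.

```python
# Toy "detector" that learns a single brightness threshold from its
# training faces. All values and the training rule are invented for
# illustration; real systems learn far richer features.

def train_detector(face_brightness_samples):
    # Learn the lowest brightness seen among training faces,
    # minus a small safety margin.
    return min(face_brightness_samples) - 5

def detects(threshold, brightness):
    # Anything below the learned threshold is dismissed as "not a face".
    return brightness >= threshold

# A training set skewed towards lighter skin tones (higher values).
training_faces = [200, 210, 190, 205, 195]
threshold = train_detector(training_faces)  # 185

print(detects(threshold, 200))  # lighter-skinned face -> True
print(detects(threshold, 90))   # darker-skinned face  -> False
```

The failure here is not malice in the code: the rule it learned is a faithful summary of the data it was given. Broaden the training set and the same rule starts working for everyone.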

While it can seem hard for us, as individuals, to influence how a phone screen blocker detects Asian eyes or how crime prevention algorithms identify suspects, the truth is that we all have a part to play. Diversity is key, and we can all start by encouraging others to become involved. Examples of this are Rails Girls and Django Girls, among others, organisations aimed at increasing the proportion of women in tech, and Black Girls Code, which aims to increase the number of women of colour in the digital space. Another great example is the Algorithmic Justice League, created by the aforementioned Joy Buolamwini to highlight algorithmic bias.

If you identify with any of these stories, get involved. If you have ever found it difficult to use an app or website because of your ethnicity, age or disabilities, get your community involved. Educate them; attract them to the industry. Increase diversity in development teams and in test groups. If you haven't, if you've never had any struggles at all, make a special effort to become aware of social bias. Start by looking at your surroundings. Inspect the company you work at and analyse whether it's diverse enough. Encourage diversity. Improve tech.