Following the Black History Month meeting and what it means for Learning Technologists (https://bit.ly/3U0JGPx), the ARLT SIG community met in November for something closer to the day-to-day job: anti-racist approaches in technology.
In this blog post I provide a short summary of the session, supplemented with some examples, along with a summary of the Q&A and additional links for further reading. Sadly, there is no captioning embedded in the recording, and YouTube's automatic captioning is one of those technological features that would fail a discrimination test. I will request that chapters be added to the video, so that you can read this blog, pick the questions relevant to you and, if possible, watch the video from that section.
Coding Black Females & FutureLearn Course Summary
Coding Black Females (https://codingblackfemales.com/) is the organisation behind the Antiracist Approaches in Technology FutureLearn course (http://bit.ly/3ElbbOo). It has been around since 2017 and has been a much-needed space for Black women in the tech sector. The slides (at the bottom of this post) and their website have further information on their mission and aims.
The FutureLearn course highlights racist stereotyping in technologies (in particular consumer products) and outlines concepts to consider when creating technologies that do not have biases built into them. The course is split into three weeks, with a reflection each week:
- Week 1 – What is racism? How does racism impact our everyday lives? What issues can you face with technology, and how is tech shaped by social and economic nuances? How does systemic racism impact technology here in the UK?
- Week 2 – Examples of antiracist technology. How are organisations tackling this? How do individuals impact the building of antiracist tech? What can you, as an individual, do so that racism is not rolled out in the project you are working on? Week 2 also has some tech-for-good scenarios where people are building things that are positive for society.
- Week 3 – How can we ensure we build tech without biases? How do you set up your team? Week 3 also has suggestions on how, as part of a product cycle, to ensure everything you build represents society and is not aimed at only one group.
Antiracism in Tech
I would like to highlight here some well-known examples of AI with racist algorithms. In 2015, Google Photos’ face recognition “Face grouping” feature came under criticism because of its racist algorithm – Google Photos Tags Two African-Americans As Gorillas Through Facial Recognition Software (https://bit.ly/3WDKf2m) and Google apologises for racist photo blunder (https://bit.ly/3HVZIXM). Google subsequently restricted its AI recognition, so that a search for “black man” or “black woman” would only return pictures of people in black and white, sorted by gender but not race. This was, however, not the first time that racist AI was widely rolled out in consumer products. In 2009, Nikon’s infamous “Did someone blink?” message would frequently pop up when an Asian person was photographed, despite Nikon being a Japanese company headquartered in Tokyo – Nikon camera says Asians are always blinking (https://bit.ly/3hKUIdH). Another Google feature, search autocomplete, has been in the headlines since the early 2010s, and we still hear how discriminatory Google’s algorithm is – How Google’s autocomplete reveals racist, sexist and homophobic searches: Researchers claim search function ‘perpetuates prejudices’ (http://bit.ly/3WlYaKQ), Google Has a Striking History of Bias Against Black Girls (https://bit.ly/3PKSlEe) and Ten years on, search auto-complete still suggests slander and disinformation (https://bit.ly/3PQpaj5).
You might also recall another instance of racism, this time closer to home, when in 2019 Cat Hallam simply wanted to renew her UK passport online (https://bit.ly/3YzWyP4).
During her talk, Liza highlighted that diversity alone is not the answer; you, in fact, need a team that has inclusion and equity. In your work environment, you need to incorporate an understanding of how (ethnicity) data can be used, in order to make sure that we have an “even spread of individuals that represent society as a whole”. I could not help but reflect on what this means for us in our day-to-day work. Are we taking representation into consideration when forming teams? Are we capturing ethnicity data? It is no surprise that, time and again within ARLT SIG, it has been highlighted that antiracism work in tech tends to come from the US, while we shy away from addressing systemic issues within the UK. Liza highlighted that “when we talk of racism in the UK, it is more than likely rushed over”.
Summary of Question & Answer
As backup chair, I had a few questions prepared (Q1, Q2 and Q4); the others were from the audience. So happy reading and, most importantly, happy understanding! Note: this is not a word-for-word captioning of the video.
Q1. In the UK, Black people represent 1.9% of the tech industry, and Black women only 0.7%. Given that software project management is very cyclical, 1.9% Black people throughout the cycle of product design and development is very little.
That is why the US addresses racism in technology first: they record ethnicity data, whereas in the UK we do not. Some level of consideration is taken in the product life cycle; however, even this is questionable, as we keep hearing about new instances every year – for example, Google search autocomplete surfacing racist prejudices. What does a diverse team mean? A point worth reflecting on here: does a team with two people of colour (of the same ethnicity but perhaps different genders) count as diverse?
Q2. How do we overcome the issues of Black people trying to fit in and not wanting to raise issues, or looking for reaffirmation of their competence? How do we encourage the younger generation when Black people make up only 1.9% of the industry?
We need to raise awareness that it is not easy to climb the career ladder, we need role models and we need to speak up. The tech industry has had gender biases in the past (and still does to some degree). Black people are here and are present, but remain untapped because people choose to ignore that Black, competent people are here. We need to have conversations, show different role models and showcase different pathways into tech (not necessarily the traditional ones). We also need to encourage people to stay in their positions. Because of racism, many people reach a crossroads where they have to decide whether to leave or to stay. It is relatively easier to get some people up the ladder, but it is difficult to make them stay, because of the burden of racism.
Q3. Experience of working with 10000blackinterns (https://www.10000blackinterns.com/)
There are many internship programmes that help young people, in particular girls. However, we have to question where these women go and where they end up. Do these young women stay in tech roles? Do they move sideways because they feel they have hit the glass ceiling even at entry level? It is great that we are getting people into junior positions, but then what happens to them? Women of colour still feel reluctant to speak, and their expertise is still not called upon as much as that of their white male colleagues.
Q4. We live in a capitalist society; what can leaders do to influence political leaders to make the tech industry more equitable (even though some laws have loopholes and laws are not an entire deterrent to racism)?
Networks help a lot, but at the same time we need to make people visible. We need to capture metrics on what is going on in organisations. Everybody should have the opportunity to speak up (if they feel comfortable doing so). In the US it is easier to lobby for certain things – they record ethnicity – but there are still other biases; start-ups of fewer than 200 people do not have to record ethnicity. There are many good organisations, e.g. the Tech Talent Charter (https://www.techtalentcharter.co.uk/home). It is also important to have people who are willing to stand up for you; you are stronger if you have people supporting you.
Q5. Black History Month
It is nice to highlight individuals, but Black people are Black throughout the year. The same goes for Pride month: if you fall into a particular category, you remain in it throughout the year, not just for one month.
Q6. Black people within Academia
There used to be very few Black people in academia, and this continues: there are still very few Black people in higher positions. There tends to be over-exploitation of Black people with expertise, who are asked to join committees or project groups; people need to be paid for this work, not asked to do it for free. This is an issue in many industries, but until we reach the point where we record ethnicity data, we will not reach the point where we can understand the real issue. There is a big problem with the visibility of Black people at mid-level in all industries, not just tech. Women (of any colour) drop off.
- Academia was built for White, straight, and non-disabled males.
- Some tech solutions restrict the number of characters in people’s names, being very much focused on Western European names, which tend to be shorter. It is important to question who built the platforms that students fill in their details on. And then you get people wanting to shorten students’ names or call them by a different, Western name.
- The term BAME (Black, Asian and Minority Ethnic) is misleading: it includes people of ethnicities who look White. Sometimes we need to distinguish between Black and Brown. Grouping everybody together makes no sense; it hides the real statistics about people’s ethnic backgrounds and, obviously, gets in the way of bringing solutions. Hiding ethnicity under one big bubble is not a solution.
- Assessments have biases. For example, if you look at Applicant Tracking Systems, you have to question who built the system.
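The name-field problem above is one of the few places where the fix really is at the code level. As a minimal sketch (my own illustration, not from the talk – the function name and length limit are hypothetical), a more inclusive name validator simply uses a generous length limit and no ASCII-only rule:

```python
import unicodedata

# Generous upper bound; many systems wrongly cap names at 20-30 characters.
MAX_NAME_LENGTH = 300

def validate_name(name: str) -> str:
    """Accept any non-empty Unicode name, normalised (NFC) for storage."""
    normalised = unicodedata.normalize("NFC", name.strip())
    if not normalised:
        raise ValueError("Name must not be empty")
    if len(normalised) > MAX_NAME_LENGTH:
        raise ValueError(f"Name longer than {MAX_NAME_LENGTH} characters")
    # Deliberately no [A-Za-z]-only or ASCII-only check: names such as
    # "Adéọlá" or "Ngọc" must pass through unchanged.
    return normalised

# Usage: a name with diacritics is accepted rather than rejected or truncated.
print(validate_name("Adéọlá Ògúnlẹ́yẹ̀"))
```

The point is what the code does not do: there is no regex restricting names to Western European letters, and the length cap is far above anything a real name would hit.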
Recording & Presentation
Liza Layne’s presentation slides are available at: http://bit.ly/3UMJVO9
- Machine Bias: Artificial Intelligence and Discrimination https://www.researchgate.net/publication/334721591_Machine_Bias_Artificial_Intelligence_and_Discrimination
- Google Photos Tags Two African-Americans As Gorillas Through Facial Recognition Software https://bit.ly/3WDKf2m
- Google apologises for racist photo blunder https://bit.ly/3HVZIXM
- Nikon camera says Asians are always blinking https://bit.ly/3hKUIdH
- How Google’s autocomplete reveals racist, sexist and homophobic searches: Researchers claim search function ‘perpetuates prejudices’ http://bit.ly/3WlYaKQ
- Google Has a Striking History of Bias Against Black Girls https://bit.ly/3PKSlEe
- Ten years on, search auto-complete still suggests slander and disinformation https://bit.ly/3PQpaj5
- UK launched passport photo checker it knew would fail with dark skin https://www.newscientist.com/article/2219284-uk-launched-passport-photo-checker-it-knew-would-fail-with-dark-skin/