Time to ‘slow down and fix things’

In last week’s blog post I spoke about the need to challenge the overarching (male) narratives embedded in our key societal structures and the critical need to embrace a plurality of thinking and views. In recent weeks, good friend and former colleague @corney_sarah and I have been discussing (usually over a pint or two) the ‘world views’ and possible biases being programmed into the algorithms that increasingly dominate our world; algorithms that make decisions on our behalf and create ‘truths’. Of course, there is much to be said about the impact of technology on our lives (Cambridge Analytica, anyone?!) and on our jobs. In this guest blog, Sarah highlights the need for greater diversity in the tech industry, and why it’s time for the industry to put its house in order. Enjoy.

As a lesbian, I’ve benefited from the greater levels of legal equality and the shift in societal attitudes over the past 50 years. Last summer we celebrated, and reflected on how far we’ve come (and have still to go), since the decriminalisation of (male) homosexuality in the UK in 1967.

And as a woman I’ve benefited from the struggles and gains of the feminist movements. In February there was a similar period of taking stock, with the 100th anniversary in the UK of the Representation of the People Act 1918 and the (partial) enfranchisement of women voters. But as Polly Toynbee, writing in the Guardian in February (‘Will women be equal to men in 100 years?’), reminds us: ‘Liberation for women means digging up the roots of human culture, nothing less’.

So it was with dismay that I read recently of the rise of AI in the recruitment industry. The idea behind these programs is that a good prospective employee looks a lot like a good current employee. But in a workforce that still disproportionately understands a ‘good employee’ to be male, white, straight, middle class and non-disabled, when AI turns that data into a score and compares it against prospective employees, who do you think misses out?

At this point in human history we’re rapidly refashioning human culture into one based on technology, founded on machine learning and artificial intelligence. Are we laying down new roots of inequality, roots that might be just as tenacious and insidious? Roots that might take an historic struggle to dig up?

After a slew of negative press and scandals (the latest being the deeply disturbing revelations about Cambridge Analytica), we’re discovering that our technology isn’t ethically neutral – it’s shaped by the worldview of those who build and finance it.

I work in what we might think of as the ‘empathy’ side of tech – leading teams that build websites and online tools that have the user experience at their heart in design, testing and delivery. I believe passionately in building technical solutions from a position of deep empathy for your end users – all your end users. And I’m increasingly alarmed by the rise of biased tech, particularly the algorithms and AI that we’re building to run our societies, which encode not just a poor user experience but an iniquitous one, particularly for women and minority communities.

The trouble with tech, and with machine learning in particular, is, as Sara Wachter-Boettcher writes in her book Technically Wrong, that ‘the biases already present in our culture are quietly reinforced’. Tech inequality used to mean inequality of access and skills, but we increasingly understand it to mean historical prejudice being hard-wired into the system itself.

Investors are making a big bet that AI will sift through the vast amounts of information produced by our society and find patterns that will help us be more efficient, wealthier and happier.

The Guardian, Rise of the Racist Robots

But a few glimpses of the ghost in the machine hint at something darker: a Google image-recognition program that tagged a photo of a group of black friends as ‘gorillas’; Google’s ad system showing ads for higher-paying executive jobs to male job seekers more often than to female; the now-infamous COMPAS program that disproportionately discriminates against black men in the US criminal justice system.

The problem is that these machines learn from vast sets of historical, and therefore often biased, data; they don’t invent a fairer future, rather they codify our unequal past. Without immediate intervention they could replicate, by orders of magnitude, ‘the sort of large-scale systemic biases that people have spent decades campaigning to educate or legislate away’ (The Guardian, Rise of the Racist Robots).
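
To see how that codification happens, here is a deliberately toy sketch in Python (using numpy and scikit-learn; the data, features and numbers are invented for illustration and stand in for no real vendor’s system). A model trained on historical hiring decisions that favoured one group learns to score an equally skilled candidate from the other group lower:

```python
# Toy illustration: a model trained on biased historical hiring decisions
# reproduces that bias. All data here is synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Each candidate has a genuine skill score and a protected attribute
# (0 = majority group, 1 = marginalised group) unrelated to skill.
skill = rng.normal(size=n)
group = rng.integers(0, 2, size=n)

# Historical decisions favoured the majority group regardless of skill.
hired = (skill + (group == 0) + rng.normal(scale=0.5, size=n)) > 0.5

# Train on the historical record, protected attribute included.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Two candidates of identical skill, differing only in group membership:
# the model awards the marginalised candidate a markedly lower score.
same_skill = np.array([[0.0, 0.0], [0.0, 1.0]])
print(model.predict_proba(same_skill)[:, 1])
```

And simply deleting the protected attribute rarely fixes this: in real data, proxies such as postcode, school or hobbies can leak the same signal back in.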

The problem is exacerbated further when these programs are shared with the wider tech community as open-source code and used as the foundation for further products. Take, for example, Google’s Word2vec word-embedding program, designed to reconstruct the linguistic context of words. In their paper ‘Man is to Computer Programmer as Woman is to Homemaker?’, Bolukbasi et al. argue that these word embeddings exhibit ‘female/male gender stereotypes to a disturbing extent’ and that their widespread use ‘amplifies these biases’.
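
For the technically curious, the stereotyped analogies the paper describes can be reproduced in a few lines. The sketch below assumes the gensim library and Google’s pretrained GoogleNews vectors (a large separate download); the exact completions depend on the model file, but Bolukbasi et al. report ‘homemaker’ among the top answers to this very query:

```python
# Probing a pretrained word2vec model for the gender-stereotyped analogy
# discussed by Bolukbasi et al. Assumes gensim is installed and the
# GoogleNews-vectors-negative300.bin file has been downloaded separately.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Word2vec analogies are vector arithmetic:
#   computer_programmer - man + woman ≈ ?
print(vectors.most_similar(positive=["woman", "computer_programmer"],
                           negative=["man"], topn=5))
```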

These programs encode not only the biases of their training data sets, but also the biases of those working in the tech industry: an industry that is emphatically white, male and (in the US in particular) drawn from a handful of elite universities. A tech industry that prioritises, indeed lionises, programming over all other skills, particularly those of the liberal arts.

We increasingly have a trust issue with the tech industry. And as societal disquiet regarding its cavalier attitude to personal data and to ethics mounts, it’s starting to experience a backlash, from consumer activism (e.g. the #deleteFacebook campaign) to negative financial impact.

The threat of iniquitous tech needs to be addressed from many angles, with a web of interventions: legislative – some legal protections are already in place or imminent (e.g. the GDPR), though the law rarely keeps pace with technological change – and industry self-regulation (see, for example, the Institute of Business Ethics briefing of February 2018).

There is increasing pressure on the tech industry to take responsibility for the monster it has created and is still creating. The mantras of ‘creative disruption’ and ‘move fast and break things’ are indeed disruptive, are indeed breaking things: personal privacy, freedom and democracy. And some working in the tech industry are slowly beginning to hold themselves and their colleagues to account (see @MariesaKDale’s Technologist’s Hippocratic Oath). Tech academics and thought leaders are also speaking out and searching for solutions to biased algorithms.

As Sara Wachter-Boettcher reminds us in Technically Wrong, User Experience (or UX), with its intuitive interfaces and cutesy micro-content, helped to hoodwink us into thinking that tech was our friend and into giving up our personal data. UX needs to grow up and take responsibility for inclusive user design and ethical testing: designing for everyone, not just personas and defaults, and ethically stress-testing AI-generated outcomes against real-world scenarios: ‘would the result be the same if the person was gay, disabled, etc.?’

But investing artificial intelligence with emotional intelligence isn’t easy. These are complex programs. Some of it can be done programmatically: the Turing Institute’s Counterfactual Fairness project is leading the way on this thinking, and Anupam Datta has designed a program that tests for bias in recruitment AI. But some of it is down to organisational culture itself: the tech industry needs to invest in diverse and inclusive teams that are more sensitive to bias and more responsible in the way they design the programs we all increasingly rely on. ‘If your teams are diverse they’re much more likely to spot if an algorithm’s outputs disproportionately affect marginalised communities’ (Sara Wachter-Boettcher).
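
As a concrete (and deliberately simplified) illustration of the kind of check these projects formalise, the sketch below flips a single protected attribute while holding everything else fixed and measures how far the model’s score moves. The model and feature layout are hypothetical stand-ins, not the Turing Institute’s or Datta’s actual method, and a full counterfactual-fairness analysis would also adjust features that causally depend on the protected attribute:

```python
# A naive counterfactual 'flip test': change only the protected attribute
# and see whether the model's decision changes. 'model' is any classifier
# with a predict_proba method (e.g. the toy one above); both it and the
# feature layout are hypothetical.
import numpy as np

def counterfactual_gap(model, candidate, protected_index):
    """Score difference when only the protected attribute is flipped."""
    original = np.asarray(candidate, dtype=float).reshape(1, -1)
    flipped = original.copy()
    flipped[0, protected_index] = 1.0 - flipped[0, protected_index]
    return (model.predict_proba(original)[0, 1]
            - model.predict_proba(flipped)[0, 1])

# A model that is fair in this narrow sense shows a gap near zero for
# every candidate; a large gap means the attribute itself drives the score.
```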

We need to encourage more women and minorities into tech. To quote the late Karen Spärck Jones (a pioneer of information retrieval and natural language processing): ‘computing’s too important to be left to men’. Professor of Computer Science Wendy Hall writes: ‘for the good of society, we cannot allow our world to be organised by learning algorithms whose creators are overwhelmingly dominated by one gender, ethnicity, age, or culture’.

The tech industry must nurture organisational cultures that encourage and support ethical decision-making at every level. It needs to hire not just for a diversity of cultural and racial backgrounds, but also for a diversity of ideas and thinking: product teams that include not only programmers but also people with arts, social sciences and humanities training, who are better able to understand the historical and cultural context of the training data, to spot unconscious bias, and to deliver an ethical user experience. ‘From those differences will come a broader characterisation of the problems we face, and wider range of creative approaches to their solution.’ (Wendy Hall)

I will remember that there is art to technology as well as science, and that empathy, craft, and remaining mindful about the consequences of my decisions outweigh the importance of my technical knowledge, the impulse for financial benefit, or allure of status. (Technologist’s Hippocratic Oath)

Which brings us full circle, back to those biased hiring algorithms and why the people profession needs a strong view on this. If diversity and inclusion are part of the solution to building more ethical tech and securing a fairer future, we need to be championing better, more diverse, more human recruitment.

Let’s hope we’re reaching an inflection point, where societies, governments and consumers begin to respond to the issue of biased and unethical tech. We must demand that the tech industry takes responsibility for the data it collects, how it processes it and the unethical and unequal outcomes of the AI that’s being built upon it. To regain our trust, the tech industry now needs to slow down and fix things.

Featured image is of Mary Jackson (1921-2005), NASA mathematician and the agency’s first Black female engineer. Mary Jackson was also NASA’s Federal Women’s Program Manager (1979-1985), where she ‘worked hard to impact the hiring and promotion of the next generation of all of NASA’s female mathematicians, engineers and scientists’ (nasa.gov).